44842 1727204489.30191: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
44842 1727204489.30596: Added group all to inventory
44842 1727204489.30598: Added group ungrouped to inventory
44842 1727204489.30602: Group all now contains ungrouped
44842 1727204489.30604: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
44842 1727204489.52059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
44842 1727204489.52123: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
44842 1727204489.52173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
44842 1727204489.52306: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
44842 1727204489.52943: Loaded config def from plugin (inventory/script)
44842 1727204489.52946: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
44842 1727204489.52986: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
44842 1727204489.53065: Loaded config def from plugin (inventory/yaml)
44842 1727204489.53372: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
44842 1727204489.53535: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
44842 1727204489.54049: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
44842 1727204489.54053: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
44842 1727204489.54056: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
44842 1727204489.54066: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
44842 1727204489.54071: Loading data from /tmp/network-M6W/inventory-5vW.yml
44842 1727204489.54142: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
44842 1727204489.54213: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
44842 1727204489.54254: Loading data from /tmp/network-M6W/inventory-5vW.yml
44842 1727204489.54342: group all already in inventory
44842 1727204489.54349: set inventory_file for managed-node1
44842 1727204489.54354: set inventory_dir for managed-node1
44842 1727204489.54355: Added host managed-node1 to inventory
44842 1727204489.54357: Added host managed-node1 to group all
44842 1727204489.54358: set ansible_host for managed-node1
44842 1727204489.54359: set ansible_ssh_extra_args for managed-node1
44842 1727204489.55053: set inventory_file for managed-node2
44842 1727204489.55057: set inventory_dir for managed-node2
44842 1727204489.55058: Added host managed-node2 to inventory
44842 1727204489.55060: Added host managed-node2 to group all
44842 1727204489.55066: set ansible_host for managed-node2
44842 1727204489.55067: set ansible_ssh_extra_args for managed-node2
44842 1727204489.55070: set inventory_file for managed-node3
44842 1727204489.55073: set inventory_dir for managed-node3
44842 1727204489.55074: Added host managed-node3 to inventory
44842 1727204489.55076: Added host managed-node3 to group all
44842 1727204489.55077: set ansible_host for managed-node3
44842 1727204489.55077: set ansible_ssh_extra_args for managed-node3
44842 1727204489.55080: Reconcile groups and hosts in inventory.
44842 1727204489.55084: Group ungrouped now contains managed-node1
44842 1727204489.55086: Group ungrouped now contains managed-node2
44842 1727204489.55088: Group ungrouped now contains managed-node3
44842 1727204489.55172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
44842 1727204489.55305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
44842 1727204489.55354: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
44842 1727204489.55388: Loaded config def from plugin (vars/host_group_vars)
44842 1727204489.55391: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
44842 1727204489.55398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
44842 1727204489.55406: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
44842 1727204489.55450: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
44842 1727204489.55803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204489.55917: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
44842 1727204489.55960: Loaded config def from plugin (connection/local)
44842 1727204489.55967: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
44842 1727204489.56558: Loaded config def from plugin (connection/paramiko_ssh)
44842 1727204489.56565: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
44842 1727204489.57516: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
44842 1727204489.57555: Loaded config def from plugin (connection/psrp)
44842 1727204489.57559: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
44842 1727204489.58377: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
44842 1727204489.58416: Loaded config def from plugin (connection/ssh)
44842 1727204489.58419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
44842 1727204489.58799: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
44842 1727204489.58837: Loaded config def from plugin (connection/winrm)
44842 1727204489.58841: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
44842 1727204489.58876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
44842 1727204489.58942: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
44842 1727204489.59011: Loaded config def from plugin (shell/cmd)
44842 1727204489.59013: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
44842 1727204489.59042: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
44842 1727204489.59110: Loaded config def from plugin (shell/powershell)
44842 1727204489.59112: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
44842 1727204489.59170: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
44842 1727204489.59341: Loaded config def from plugin (shell/sh)
44842 1727204489.59343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
44842 1727204489.59380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
44842 1727204489.59502: Loaded config def from plugin (become/runas)
44842 1727204489.59505: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
44842 1727204489.59726: Loaded config def from plugin (become/su)
44842 1727204489.59728: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
44842 1727204489.59897: Loaded config def from plugin (become/sudo)
44842 1727204489.59900: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
44842 1727204489.59933: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
44842 1727204489.60281: in VariableManager get_vars()
44842 1727204489.60303: done with get_vars()
44842 1727204489.60454: trying /usr/local/lib/python3.12/site-packages/ansible/modules
44842 1727204489.64011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
44842 1727204489.64147: in VariableManager get_vars()
44842 1727204489.64152: done with get_vars()
44842 1727204489.64155: variable 'playbook_dir' from source: magic vars
44842 1727204489.64156: variable 'ansible_playbook_python' from source: magic vars
44842 1727204489.64156: variable 'ansible_config_file' from source: magic vars
44842 1727204489.64157: variable 'groups' from source: magic vars
44842 1727204489.64158: variable 'omit' from source: magic vars
44842 1727204489.64159: variable 'ansible_version' from source: magic vars
44842 1727204489.64160: variable 'ansible_check_mode' from source: magic vars
44842 1727204489.64161: variable 'ansible_diff_mode' from source: magic vars
44842 1727204489.64162: variable 'ansible_forks' from source: magic vars
44842 1727204489.64162: variable 'ansible_inventory_sources' from source: magic vars
44842 1727204489.64163: variable 'ansible_skip_tags' from source: magic vars
44842 1727204489.64166: variable 'ansible_limit' from source: magic vars
44842 1727204489.64167: variable 'ansible_run_tags' from source: magic vars
44842 1727204489.64168: variable 'ansible_verbosity' from source: magic vars
44842 1727204489.64205: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml
44842 1727204489.65327: in VariableManager get_vars()
44842 1727204489.65344: done with get_vars()
44842 1727204489.65384: in VariableManager get_vars()
44842 1727204489.65399: done with get_vars()
44842 1727204489.65434: in VariableManager get_vars()
44842 1727204489.65448: done with get_vars()
44842 1727204489.65524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
44842 1727204489.65642: in VariableManager get_vars()
44842 1727204489.65656: done with get_vars()
44842 1727204489.65661: variable 'omit' from source: magic vars
44842 1727204489.65682: variable 'omit' from source: magic vars
44842 1727204489.65718: in VariableManager get_vars()
44842 1727204489.65728: done with get_vars()
44842 1727204489.65780: in VariableManager get_vars()
44842 1727204489.65794: done with get_vars()
44842 1727204489.65831: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
44842 1727204489.66059: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
44842 1727204489.66190: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
44842 1727204489.66867: in VariableManager get_vars()
44842 1727204489.66887: done with get_vars()
44842 1727204489.67283: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44842 1727204489.69945: in VariableManager get_vars()
44842 1727204489.69948: done with get_vars()
44842 1727204489.69950: variable 'playbook_dir' from source: magic vars
44842 1727204489.69951: variable 'ansible_playbook_python' from source: magic vars
44842 1727204489.69952: variable 'ansible_config_file' from source: magic vars
44842 1727204489.69953: variable 'groups' from source: magic vars
44842 1727204489.69953: variable 'omit' from source: magic vars
44842 1727204489.69954: variable 'ansible_version' from source: magic vars
44842 1727204489.69955: variable 'ansible_check_mode' from source: magic vars
44842 1727204489.69956: variable 'ansible_diff_mode' from source: magic vars
44842 1727204489.69956: variable 'ansible_forks' from source: magic vars
44842 1727204489.69957: variable 'ansible_inventory_sources' from source: magic vars
44842 1727204489.69958: variable 'ansible_skip_tags' from source: magic vars
44842 1727204489.69959: variable 'ansible_limit' from source: magic vars
44842 1727204489.69959: variable 'ansible_run_tags' from source: magic vars
44842 1727204489.69960: variable 'ansible_verbosity' from source: magic vars
44842 1727204489.69995: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
44842 1727204489.70072: in VariableManager get_vars()
44842 1727204489.70075: done with get_vars()
44842 1727204489.70077: variable 'playbook_dir' from source: magic vars
44842 1727204489.70078: variable 'ansible_playbook_python' from source: magic vars
44842 1727204489.70079: variable 'ansible_config_file' from source: magic vars
44842 1727204489.70080: variable 'groups' from source: magic vars
44842 1727204489.70080: variable 'omit' from source: magic vars
44842 1727204489.70081: variable 'ansible_version' from source: magic vars
44842 1727204489.70082: variable 'ansible_check_mode' from source: magic vars
44842 1727204489.70083: variable 'ansible_diff_mode' from source: magic vars
44842 1727204489.70083: variable 'ansible_forks' from source: magic vars
44842 1727204489.70084: variable 'ansible_inventory_sources' from source: magic vars
44842 1727204489.70085: variable 'ansible_skip_tags' from source: magic vars
44842 1727204489.70086: variable 'ansible_limit' from source: magic vars
44842 1727204489.70086: variable 'ansible_run_tags' from source: magic vars
44842 1727204489.70087: variable 'ansible_verbosity' from source: magic vars
44842 1727204489.70119: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
44842 1727204489.70186: in VariableManager get_vars()
44842 1727204489.70198: done with get_vars()
44842 1727204489.70239: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
44842 1727204489.70346: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
44842 1727204489.70419: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
44842 1727204489.70805: in VariableManager get_vars()
44842 1727204489.70825: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44842 1727204489.72423: in VariableManager get_vars()
44842 1727204489.72437: done with get_vars()
44842 1727204489.72476: in VariableManager get_vars()
44842 1727204489.72479: done with get_vars()
44842 1727204489.72482: variable 'playbook_dir' from source: magic vars
44842 1727204489.72483: variable 'ansible_playbook_python' from source: magic vars
44842 1727204489.72484: variable 'ansible_config_file' from source: magic vars
44842 1727204489.72484: variable 'groups' from source: magic vars
44842 1727204489.72485: variable 'omit' from source: magic vars
44842 1727204489.72486: variable 'ansible_version' from source: magic vars
44842 1727204489.72486: variable 'ansible_check_mode' from source: magic vars
44842 1727204489.72487: variable 'ansible_diff_mode' from source: magic vars
44842 1727204489.72488: variable 'ansible_forks' from source: magic vars
44842 1727204489.72489: variable 'ansible_inventory_sources' from source: magic vars
44842 1727204489.72489: variable 'ansible_skip_tags' from source: magic vars
44842 1727204489.72490: variable 'ansible_limit' from source: magic vars
44842 1727204489.72491: variable 'ansible_run_tags' from source: magic vars
44842 1727204489.72492: variable 'ansible_verbosity' from source: magic vars
44842 1727204489.72523: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
44842 1727204489.72608: in VariableManager get_vars()
44842 1727204489.72620: done with get_vars()
44842 1727204489.72660: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
44842 1727204489.72785: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
44842 1727204489.72865: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
44842 1727204489.73261: in VariableManager get_vars()
44842 1727204489.73281: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
44842 1727204489.75052: in VariableManager get_vars()
44842 1727204489.75068: done with get_vars()
44842 1727204489.75104: in VariableManager get_vars()
44842 1727204489.75116: done with get_vars()
44842 1727204489.75152: in VariableManager get_vars()
44842 1727204489.75166: done with get_vars()
44842 1727204489.75230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
44842 1727204489.75245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
44842 1727204489.75704: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
44842 1727204489.78307: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
44842 1727204489.78310: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
44842 1727204489.78360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
44842 1727204489.78390: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
44842 1727204489.78577: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
44842 1727204489.78640: Loaded config def from plugin (callback/default)
44842 1727204489.78642: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44842 1727204489.82900: Loaded config def from plugin (callback/junit)
44842 1727204489.82904: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44842 1727204489.82959: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
44842 1727204489.83031: Loaded config def from plugin (callback/minimal)
44842 1727204489.83034: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44842 1727204489.83076: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
44842 1727204489.83137: Loaded config def from plugin (callback/tree)
44842 1727204489.83140: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
44842 1727204489.83274: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
44842 1727204489.83277: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_routing_rules_nm.yml *******************************************
6 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
44842 1727204489.83305: in VariableManager get_vars()
44842 1727204489.83321: done with get_vars()
44842 1727204489.83326: in VariableManager get_vars()
44842 1727204489.83335: done with get_vars()
44842 1727204489.83345: variable 'omit' from source: magic vars
44842 1727204489.83389: in VariableManager get_vars()
44842 1727204489.83974: done with get_vars()
44842 1727204489.84001: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider] ****
44842 1727204489.84915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
44842 1727204489.84999: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
44842 1727204489.86084: getting the remaining hosts for this loop
44842 1727204489.86086: done getting the remaining hosts for this loop
44842 1727204489.86094: getting the next task for host managed-node1
44842 1727204489.86097: done getting next task for host managed-node1
44842 1727204489.86099: ^ task is: TASK: Gathering Facts
44842 1727204489.86101: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204489.86103: getting variables
44842 1727204489.86104: in VariableManager get_vars()
44842 1727204489.86114: Calling all_inventory to load vars for managed-node1
44842 1727204489.86116: Calling groups_inventory to load vars for managed-node1
44842 1727204489.86118: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204489.86130: Calling all_plugins_play to load vars for managed-node1
44842 1727204489.86140: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204489.86143: Calling groups_plugins_play to load vars for managed-node1
44842 1727204489.86179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204489.86230: done with get_vars()
44842 1727204489.86236: done getting variables
44842 1727204489.86300: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Tuesday 24 September 2024 15:01:29 -0400 (0:00:00.031) 0:00:00.031 *****
44842 1727204489.86320: entering _queue_task() for managed-node1/gather_facts
44842 1727204489.86322: Creating lock for gather_facts
44842 1727204489.86632: worker is 1 (out of 1 available)
44842 1727204489.86641: exiting _queue_task() for managed-node1/gather_facts
44842 1727204489.86653: done queuing things up, now waiting for results queue to drain
44842 1727204489.86655: waiting for pending results...
44842 1727204489.87006: running TaskExecutor() for managed-node1/TASK: Gathering Facts
44842 1727204489.87213: in run() - task 0affcd87-79f5-aad0-d242-0000000000af
44842 1727204489.87232: variable 'ansible_search_path' from source: unknown
44842 1727204489.87278: calling self._execute()
44842 1727204489.87344: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204489.87437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204489.87453: variable 'omit' from source: magic vars
44842 1727204489.87677: variable 'omit' from source: magic vars
44842 1727204489.87710: variable 'omit' from source: magic vars
44842 1727204489.87751: variable 'omit' from source: magic vars
44842 1727204489.87864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204489.87909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204489.87939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204489.87970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204489.87991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204489.88026: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204489.88034: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204489.88043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204489.88146: Set connection var ansible_shell_type to sh
44842 1727204489.88166: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204489.88184: Set connection var ansible_connection to ssh
44842 1727204489.88201: Set connection var ansible_pipelining to False
44842 1727204489.88211: Set connection var ansible_timeout to 10
44842 1727204489.88223: Set connection var ansible_shell_executable to /bin/sh
44842 1727204489.88255: variable 'ansible_shell_executable' from source: unknown
44842 1727204489.88269: variable 'ansible_connection' from source: unknown
44842 1727204489.88278: variable 'ansible_module_compression' from source: unknown
44842 1727204489.88286: variable 'ansible_shell_type' from source: unknown
44842 1727204489.88296: variable 'ansible_shell_executable' from source: unknown
44842 1727204489.88307: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204489.88314: variable 'ansible_pipelining' from source: unknown
44842 1727204489.88320: variable 'ansible_timeout' from source: unknown
44842 1727204489.88327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204489.88580: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
44842 1727204489.88598: variable 'omit' from source: magic vars
44842 1727204489.88609: starting attempt loop
44842 1727204489.88626: running the handler
44842 1727204489.88651: variable 'ansible_facts' from source: unknown
44842 1727204489.88682: _low_level_execute_command(): starting
44842 1727204489.88696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44842 1727204489.89528: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204489.89544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204489.89560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204489.89589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204489.89643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204489.89657: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204489.89677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204489.89700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204489.89719: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204489.89732: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204489.89746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204489.89761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204489.89785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204489.89798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204489.89810: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204489.89831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204489.89911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204489.89944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204489.89969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204489.90077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204489.91748: stdout chunk (state=3): >>>/root <<<
44842 1727204489.91945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204489.91948: stdout chunk (state=3): >>><<<
44842 1727204489.91951: stderr chunk (state=3): >>><<<
44842 1727204489.92056: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
44842 1727204489.92061: _low_level_execute_command(): starting
44842 1727204489.92066: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517 `" && echo ansible-tmp-1727204489.9197688-44893-179572236939517="` echo /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517 `" ) && sleep 0'
44842 1727204489.93278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204489.93300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204489.93312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204489.93327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204489.93368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204489.93380: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204489.93397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204489.93413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204489.93422: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204489.93431: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204489.93440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204489.93450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204489.93467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204489.93477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204489.93486: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204489.93501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204489.93581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204489.93605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204489.93620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204489.93701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204489.95570: stdout chunk (state=3): 
>>>ansible-tmp-1727204489.9197688-44893-179572236939517=/root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517 <<< 44842 1727204489.95768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204489.95772: stdout chunk (state=3): >>><<< 44842 1727204489.95774: stderr chunk (state=3): >>><<< 44842 1727204489.95976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204489.9197688-44893-179572236939517=/root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204489.95980: variable 'ansible_module_compression' from source: unknown 44842 1727204489.95983: ANSIBALLZ: Using generic lock for ansible.legacy.setup 44842 1727204489.95985: ANSIBALLZ: Acquiring lock 44842 1727204489.95987: ANSIBALLZ: Lock acquired: 140164881036544 44842 1727204489.95989: ANSIBALLZ: Creating 
module 44842 1727204490.31367: ANSIBALLZ: Writing module into payload 44842 1727204490.31565: ANSIBALLZ: Writing module 44842 1727204490.31599: ANSIBALLZ: Renaming module 44842 1727204490.31615: ANSIBALLZ: Done creating module 44842 1727204490.31675: variable 'ansible_facts' from source: unknown 44842 1727204490.31687: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204490.31702: _low_level_execute_command(): starting 44842 1727204490.31712: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 44842 1727204490.32431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204490.32446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.32460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.32489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.32543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204490.32555: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204490.32575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.32593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204490.32605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204490.32620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204490.32639: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.32653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.32674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.32687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204490.32698: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204490.32710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.32788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204490.32811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204490.32827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204490.33168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204490.34494: stdout chunk (state=3): >>>PLATFORM <<< 44842 1727204490.34609: stdout chunk (state=3): >>>Linux <<< 44842 1727204490.34613: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 44842 1727204490.34825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204490.34828: stdout chunk (state=3): >>><<< 44842 1727204490.34830: stderr chunk (state=3): >>><<< 44842 1727204490.34872: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204490.34881 [managed-node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 44842 1727204490.34995: _low_level_execute_command(): starting 44842 1727204490.34999: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 44842 1727204490.35416: Sending initial data 44842 1727204490.35420: Sent initial data (1181 bytes) 44842 1727204490.35891: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.35895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.35929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204490.35932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.35935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.36003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204490.36998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204490.37045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204490.40816: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 44842 1727204490.41370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204490.41373: stdout chunk (state=3): >>><<< 44842 1727204490.41376: stderr chunk (state=3): >>><<< 44842 1727204490.41378: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204490.41380: variable 'ansible_facts' from source: unknown 44842 1727204490.41382: variable 'ansible_facts' from source: unknown 44842 1727204490.41384: variable 'ansible_module_compression' from source: unknown 44842 1727204490.41427: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44842 1727204490.41455: variable 'ansible_facts' from source: unknown 44842 1727204490.41616: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/AnsiballZ_setup.py 44842 1727204490.42502: Sending initial data 44842 1727204490.42506: Sent initial data (154 bytes) 44842 1727204490.44893: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 
1727204490.45004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.45020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.45058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.45068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204490.45109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.45217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204490.45223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.45303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204490.45317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204490.45333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204490.45410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204490.47105: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 44842 1727204490.47109: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204490.47158: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204490.47214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp258d578a /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/AnsiballZ_setup.py <<< 44842 1727204490.47268: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204490.50620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204490.50715: stderr chunk (state=3): >>><<< 44842 1727204490.50718: stdout chunk (state=3): >>><<< 44842 1727204490.50744: done transferring module to remote 44842 1727204490.50756: _low_level_execute_command(): starting 44842 1727204490.50761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/ /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/AnsiballZ_setup.py && sleep 0' 44842 1727204490.52326: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.52332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.52404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204490.52492: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204490.52503: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.52524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204490.52541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204490.52548: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204490.52556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.52568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.52580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.52588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204490.52595: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204490.52605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.52675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204490.52778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204490.52788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204490.53024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204490.54645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204490.54648: stdout chunk (state=3): >>><<< 44842 1727204490.54655: stderr chunk (state=3): >>><<< 44842 1727204490.54672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204490.54676: _low_level_execute_command(): starting 44842 1727204490.54683: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/AnsiballZ_setup.py && sleep 0' 44842 1727204490.56880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204490.56999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.57013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.57030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.57074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204490.57086: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204490.57103: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.57119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204490.57129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204490.57139: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204490.57149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204490.57161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204490.57182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204490.57194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204490.57209: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204490.57223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204490.57304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204490.57444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204490.57459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204490.57555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204490.59480: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # <<< 44842 1727204490.59483: stdout chunk (state=3): >>>import '_weakref' # <<< 44842 1727204490.59541: stdout chunk (state=3): >>>import '_io' # <<< 44842 1727204490.59544: stdout chunk (state=3): >>>import 'marshal' # <<< 44842 1727204490.59568: stdout chunk (state=3): >>>import 'posix' # <<< 44842 1727204490.59599: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing 
zipimport hook <<< 44842 1727204490.59651: stdout chunk (state=3): >>>import 'time' # <<< 44842 1727204490.59654: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 44842 1727204490.59730: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.59734: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 44842 1727204490.59757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 44842 1727204490.59776: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686898dc0> <<< 44842 1727204490.59811: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 44842 1727204490.59838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686898b20> <<< 44842 1727204490.59868: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 44842 1727204490.59902: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686898ac0> <<< 44842 1727204490.59942: stdout chunk (state=3): >>>import '_signal' # <<< 44842 1727204490.59980: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d490> <<< 44842 1727204490.59991: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 44842 1727204490.60011: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d940> <<< 44842 1727204490.60022: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d670> <<< 44842 1727204490.60057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 44842 1727204490.60092: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 44842 1727204490.60103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 44842 1727204490.60134: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 44842 1727204490.60137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 44842 1727204490.60161: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865cf190> <<< 44842 1727204490.60192: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 44842 1727204490.60207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 44842 1727204490.60292: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865cf220> <<< 44842 1727204490.60325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 44842 1727204490.60339: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865cf940> <<< 44842 1727204490.60385: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686855880> <<< 44842 1727204490.60398: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865c8d90> <<< 44842 1727204490.60466: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 44842 1727204490.60470: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865f2d90> <<< 44842 1727204490.60521: 
stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d970> <<< 44842 1727204490.60541: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 44842 1727204490.60870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 44842 1727204490.60913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 44842 1727204490.60917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 44842 1727204490.60936: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 44842 1727204490.60977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 44842 1727204490.60990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668656df10> <<< 44842 1727204490.61042: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865740a0> <<< 44842 1727204490.61056: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 44842 1727204490.61104: stdout chunk (state=3): >>>import '_sre' # <<< 44842 1727204490.61107: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches 
/usr/lib64/python3.9/sre_parse.py <<< 44842 1727204490.61144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 44842 1727204490.61148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 44842 1727204490.61172: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865675b0> <<< 44842 1727204490.61207: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668656e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668656d3d0> <<< 44842 1727204490.61214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 44842 1727204490.61276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 44842 1727204490.61307: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 44842 1727204490.61319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.61346: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 44842 1727204490.61420: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f6686455e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686455910> <<< 44842 1727204490.61423: stdout chunk (state=3): >>>import 'itertools' # <<< 44842 1727204490.61462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686455f10> <<< 44842 1727204490.61480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 44842 1727204490.61512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686455fd0> <<< 44842 1727204490.61542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66864660d0> <<< 44842 1727204490.61545: stdout chunk (state=3): >>>import '_collections' # <<< 44842 1727204490.61593: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686549d90> import '_functools' # <<< 44842 1727204490.61615: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686542670> <<< 44842 1727204490.61696: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 44842 1727204490.61713: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f66865556d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686575e80> <<< 44842 1727204490.61738: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 44842 1727204490.61750: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686466cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865492b0> <<< 44842 1727204490.61797: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204490.61837: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66865552e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668657ba30> <<< 44842 1727204490.61842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 44842 1727204490.61871: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 44842 1727204490.61896: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466df0> <<< 44842 1727204490.61942: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 44842 1727204490.61977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466d60> <<< 44842 1727204490.61982: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 44842 1727204490.62003: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 44842 1727204490.62016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 44842 1727204490.62061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 44842 1727204490.62108: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66864393d0> <<< 44842 1727204490.62124: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 44842 1727204490.62127: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 44842 1727204490.62156: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66864394c0> <<< 44842 1727204490.62281: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668646ef40> <<< 44842 1727204490.62322: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686468a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686468490> <<< 44842 1727204490.62357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 44842 1727204490.62365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 44842 1727204490.62405: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 44842 1727204490.62421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 44842 1727204490.62447: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668636d220> <<< 44842 1727204490.62480: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686424520> <<< 44842 1727204490.62525: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686468f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668657b0a0> <<< 44842 1727204490.62550: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 44842 1727204490.62574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 44842 1727204490.62613: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668637fb50> import 'errno' # <<< 44842 1727204490.62669: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668637fe80> <<< 44842 1727204490.62693: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 44842 1727204490.62718: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686390790> <<< 44842 1727204490.62737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 44842 1727204490.62786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 44842 1727204490.62799: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686390cd0> <<< 44842 1727204490.62838: stdout chunk (state=3): >>># extension module 
'_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668632a400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668637ff70> <<< 44842 1727204490.62868: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 44842 1727204490.62879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 44842 1727204490.62924: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668633a2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686390610> <<< 44842 1727204490.62954: stdout chunk (state=3): >>>import 'pwd' # <<< 44842 1727204490.62971: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668633a3a0> <<< 44842 1727204490.62996: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466a30> <<< 44842 1727204490.63029: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 44842 1727204490.63060: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 44842 1727204490.63075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 44842 1727204490.63101: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686356700> <<< 44842 1727204490.63138: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 44842 1727204490.63178: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66863569d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66863567c0> <<< 44842 1727204490.63192: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66863568b0> <<< 44842 1727204490.63217: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 44842 1727204490.63409: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686356d00> <<< 44842 1727204490.63458: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686362250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686356940> <<< 44842 1727204490.63485: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686349a90> <<< 44842 1727204490.63508: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 44842 1727204490.63578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 44842 1727204490.63601: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686356af0> <<< 44842 1727204490.63755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 44842 1727204490.63758: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f668628b6d0> <<< 44842 1727204490.64052: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 44842 1727204490.64145: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 
1727204490.64182: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 44842 1727204490.64214: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44842 1727204490.64233: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 44842 1727204490.64236: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.65431: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.66349: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9820> <<< 44842 1727204490.66392: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.66423: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 44842 1727204490.66436: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 44842 1727204490.66477: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861c9160> <<< 44842 1727204490.66498: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9280> <<< 44842 1727204490.66544: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9f70> <<< 44842 1727204490.66559: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 44842 1727204490.66608: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c94f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9d90> import 'atexit' # <<< 44842 1727204490.66648: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861c9fd0> <<< 44842 1727204490.66662: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 44842 1727204490.66686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 44842 1727204490.66731: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9100> <<< 44842 1727204490.66752: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 
44842 1727204490.66786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 44842 1727204490.66812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 44842 1727204490.66827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 44842 1727204490.66903: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686188f40> <<< 44842 1727204490.66962: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685b88340> <<< 44842 1727204490.66987: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685b88040> <<< 44842 1727204490.67009: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 44842 1727204490.67047: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685b88ca0> <<< 44842 1727204490.67062: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861b1dc0> <<< 
44842 1727204490.67217: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861b13a0> <<< 44842 1727204490.67254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 44842 1727204490.67275: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861b1fd0> <<< 44842 1727204490.67303: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 44842 1727204490.67320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 44842 1727204490.67341: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 44842 1727204490.67381: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 44842 1727204490.67403: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861fed30> <<< 44842 1727204490.67486: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861d0d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861d0400> <<< 44842 1727204490.67504: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668617cb20> <<< 44842 1727204490.67525: 
stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861d0520> <<< 44842 1727204490.67550: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861d0550> <<< 44842 1727204490.67585: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 44842 1727204490.67598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 44842 1727204490.67635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 44842 1727204490.67698: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685bf6fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686210250> <<< 44842 1727204490.67737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 44842 1727204490.67740: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 44842 1727204490.67797: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685bf3850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66862103d0> <<< 44842 1727204490.67819: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 44842 1727204490.67861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.67887: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 44842 1727204490.67947: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686210ca0> <<< 44842 1727204490.68082: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685bf37f0> <<< 44842 1727204490.68168: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861a9c10> <<< 44842 1727204490.68206: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # 
extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686210fa0> <<< 44842 1727204490.68249: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686210550> <<< 44842 1727204490.68252: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686209910> <<< 44842 1727204490.68287: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 44842 1727204490.68305: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 44842 1727204490.68316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 44842 1727204490.68356: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685be8940> <<< 44842 1727204490.68549: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686120d90> <<< 44842 1727204490.68553: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685bf2580> <<< 44842 1727204490.68613: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685be8ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685bf29a0> <<< 44842 1727204490.68636: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 44842 1727204490.68708: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.68804: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.68838: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 44842 1727204490.68858: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 44842 1727204490.68861: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 
1727204490.68944: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.69036: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.69486: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.69956: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 44842 1727204490.69968: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 44842 1727204490.69998: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 44842 1727204490.70001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.70045: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861497f0> <<< 44842 1727204490.70135: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 44842 1727204490.70138: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668614e8b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f6685770970> <<< 44842 1727204490.70196: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 44842 1727204490.70223: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.70236: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 44842 1727204490.70355: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.70486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 44842 1727204490.70523: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686187730> <<< 44842 1727204490.70526: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.70906: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71284: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71331: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71400: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 44842 1727204490.71441: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71472: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 44842 1727204490.71485: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71536: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71626: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 44842 1727204490.71652: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 44842 1727204490.71688: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.71728: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 44842 1727204490.71917: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72107: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 44842 1727204490.72139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 44842 1727204490.72220: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861cc370> <<< 44842 1727204490.72223: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72287: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72354: stdout chunk (state=3): >>>import 
ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 44842 1727204490.72380: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 44842 1727204490.72433: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72462: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 44842 1727204490.72509: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72551: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72641: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.72698: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 44842 1727204490.72723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.72882: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668613c550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66855eceb0> <<< 44842 1727204490.72924: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 44842 1727204490.72982: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73043: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73056: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73112: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 44842 1727204490.73115: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 44842 1727204490.73140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 44842 1727204490.73161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 44842 1727204490.73187: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 44842 1727204490.73206: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 44842 1727204490.73286: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861437f0> <<< 44842 1727204490.73326: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686142790> <<< 44842 1727204490.73395: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668613cb50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 44842 1727204490.73398: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73433: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73447: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 44842 1727204490.73524: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 44842 1727204490.73552: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 44842 1727204490.73565: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73617: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 
1727204490.73680: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73708: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73711: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73748: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73782: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73816: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73847: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 44842 1727204490.73859: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73923: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.73998: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.74011: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.74045: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 44842 1727204490.74200: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.74340: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.74377: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.74422: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.74453: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 44842 1727204490.74490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 44842 1727204490.74493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 44842 1727204490.74513: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685736370> <<< 44842 1727204490.74539: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 44842 1727204490.74569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 44842 1727204490.74590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 44842 1727204490.74626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 44842 1727204490.74642: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685753a90> <<< 44842 1727204490.74683: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685753b20> <<< 44842 1727204490.74744: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685726280> <<< 44842 1727204490.74758: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685736970> <<< 44842 1727204490.74789: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854f17f0> <<< 44842 1727204490.74808: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854f1b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 44842 1727204490.74832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 44842 1727204490.74857: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 44842 1727204490.74900: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66857990a0> <<< 44842 1727204490.74925: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685733f70> <<< 44842 1727204490.74939: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 44842 
1727204490.74974: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685799190> <<< 44842 1727204490.74995: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 44842 1727204490.75024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 44842 1727204490.75037: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685559fd0> <<< 44842 1727204490.75059: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685782820> <<< 44842 1727204490.75102: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854f1d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 44842 1727204490.75139: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 44842 1727204490.75152: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 44842 1727204490.75202: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75252: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 44842 1727204490.75302: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75363: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 44842 1727204490.75385: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 44842 1727204490.75416: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75444: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 44842 1727204490.75460: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75490: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75538: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 44842 1727204490.75584: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75622: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 44842 1727204490.75681: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75731: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75779: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.75834: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 44842 1727204490.75847: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76219: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76587: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 44842 1727204490.76633: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76685: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76709: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76758: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 44842 1727204490.76777: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76789: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76816: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 44842 1727204490.76871: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76925: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 44842 1727204490.76928: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76947: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.76983: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 44842 1727204490.77014: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77042: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 44842 1727204490.77055: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77114: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches 
/usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 44842 1727204490.77221: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685439e80> <<< 44842 1727204490.77235: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 44842 1727204490.77266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 44842 1727204490.77421: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854399d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 44842 1727204490.77425: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77485: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77549: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 44842 1727204490.77553: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77621: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77712: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 44842 1727204490.77776: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77846: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 44842 1727204490.77849: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77881: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.77932: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 44842 1727204490.77944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 44842 1727204490.78089: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66854b5550> <<< 44842 1727204490.78333: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685468850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 44842 1727204490.78336: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78385: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78439: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 44842 1727204490.78443: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78506: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78583: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78678: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 44842 1727204490.78819: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 44842 1727204490.78822: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78853: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78904: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 44842 1727204490.78907: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78938: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.78987: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 44842 1727204490.79040: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204490.79068: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66854b4670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854b4220> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available <<< 44842 1727204490.79090: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 44842 1727204490.79127: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79176: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 44842 1727204490.79179: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79296: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79427: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 44842 1727204490.79517: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79601: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79634: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79696: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 44842 1727204490.79699: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79763: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79788: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.79899: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.80023: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 44842 1727204490.80037: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.80134: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.80244: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 44842 1727204490.80284: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.80310: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.80749: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81177: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 44842 1727204490.81181: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 44842 1727204490.81262: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81359: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 44842 1727204490.81362: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81440: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81530: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 44842 1727204490.81661: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81795: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 44842 1727204490.81836: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 44842 1727204490.81839: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81879: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.81915: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 44842 1727204490.81999: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 44842 1727204490.82088: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82256: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82430: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 44842 1727204490.82445: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82476: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82509: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 44842 1727204490.82566: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82570: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 44842 1727204490.82582: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82629: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82695: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 44842 1727204490.82727: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 
1727204490.82750: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 44842 1727204490.82799: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82855: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 44842 1727204490.82910: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.82960: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 44842 1727204490.82977: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83183: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83403: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 44842 1727204490.83457: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83522: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 44842 1727204490.83525: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83546: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83585: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 44842 1727204490.83622: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83650: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 44842 1727204490.83672: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83685: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83721: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 44842 1727204490.83795: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83869: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 44842 1727204490.83897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 44842 1727204490.83910: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83952: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.83991: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 44842 1727204490.84011: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84034: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84077: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84119: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84181: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84251: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 44842 1727204490.84265: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84308: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84360: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 44842 1727204490.84363: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84517: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84689: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 44842 1727204490.84732: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84787: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 44842 1727204490.84790: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84824: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84872: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 44842 1727204490.84875: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.84944: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.85022: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 44842 1727204490.85025: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.85097: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.85177: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py 
import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 44842 1727204490.85235: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204490.85412: stdout chunk (state=3): >>>import 'gc' # <<< 44842 1727204490.86242: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 44842 1727204490.86247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 44842 1727204490.86277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 44842 1727204490.86320: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66853f9160> <<< 44842 1727204490.86323: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686151e50> <<< 44842 1727204490.86386: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668544ed30> <<< 44842 1727204490.88413: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 44842 1727204490.88433: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668544ef40> <<< 44842 1727204490.88477: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 44842 1727204490.88499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 44842 1727204490.88513: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685468250> <<< 44842 1727204490.88567: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204490.88598: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685256cd0> <<< 44842 1727204490.88620: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668528ceb0> <<< 44842 1727204490.88949: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 44842 1727204490.88952: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 44842 
1727204491.14021: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQ<<< 44842 1727204491.14083: stdout chunk (state=3): >>>WhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2796, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 736, "free": 2796}, "nocache": {"free": 3272, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 753, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271769600, "block_size": 4096, "block_total": 65519355, "block_available": 64519475, "block_used": 999880, "inode_total": 131071472, "inode_available": 130998230, "inode_used": 73242, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, 
"promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatt<<< 44842 1727204491.14112: stdout chunk (state=3): >>>er_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off 
[fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_loadavg": {"1m": 0.52, "5m": 0.45, "15m": 0.29}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "31", "epoch": "1727204491", 
"epoch_int": "1727204491", "date": "2024-09-24", "time": "15:01:31", "iso8601_micro": "2024-09-24T19:01:31.136083Z", "iso8601": "2024-09-24T19:01:31Z", "iso8601_basic": "20240924T150131136083", "iso8601_basic_short": "20240924T150131", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204491.14632: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 44842 1727204491.14783: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # 
cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing 
collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 44842 1727204491.14851: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy 
systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 44842 1727204491.14866: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 44842 1727204491.14922: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context <<< 44842 1727204491.14946: stdout chunk (state=3): >>># 
cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl <<< 44842 1727204491.14970: stdout chunk (state=3): >>># cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 44842 1727204491.15221: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44842 1727204491.15247: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 44842 1727204491.15302: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 44842 1727204491.15327: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 44842 1727204491.15341: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 44842 1727204491.15367: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 44842 1727204491.15410: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 44842 1727204491.15453: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 44842 1727204491.15496: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 44842 1727204491.15536: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 <<< 44842 1727204491.15587: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 44842 1727204491.15603: stdout chunk (state=3): >>># destroy json <<< 44842 1727204491.15620: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 44842 1727204491.15893: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 44842 1727204491.16006: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping 
select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # 
cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 44842 1727204491.16009: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 44842 1727204491.16174: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 44842 1727204491.16198: stdout chunk (state=3): >>># destroy tokenize <<< 44842 1727204491.16226: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 44842 1727204491.16249: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 44842 1727204491.16285: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 44842 1727204491.16578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204491.16682: stderr chunk (state=3): >>>Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204491.16685: stdout chunk (state=3): >>><<< 44842 1727204491.16688: stderr chunk (state=3): >>><<< 44842 1727204491.16949: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686898dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686898b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686898ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865cf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6686855880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668683d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668656df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668656e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668656d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686455e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686455910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686455f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686455fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66864660d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686549d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686542670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865556d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686575e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686466cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66865492b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66865552e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668657ba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66864393d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66864394c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668646ef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686468a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686468490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668636d220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686424520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686468f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668657b0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668637fb50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668637fe80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686390790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686390cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668632a400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668637ff70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668633a2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686390610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668633a3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686356700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66863569d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66863567c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66863568b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686356d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686362250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686356940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686349a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686466610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686356af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f668628b6d0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861c9160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c94f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9d90> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861c9fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861c9100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686188f40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685b88340> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685b88040> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685b88ca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861b1dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861b13a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861b1fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861fed30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861d0d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861d0400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668617cb20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861d0520> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861d0550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685bf6fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686210250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685bf3850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66862103d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686210ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685bf37f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861a9c10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686210fa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686210550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686209910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685be8940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6686120d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685bf2580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685be8ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685bf29a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66861497f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668614e8b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685770970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686187730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861cc370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f668613c550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66855eceb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66861437f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686142790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668613cb50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685736370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685753a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685753b20> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685726280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685736970> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854f17f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854f1b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66857990a0> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6685733f70> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685799190> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6685559fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685782820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854f1d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6685439e80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854399d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66854b5550> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685468850> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66854b4670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66854b4220> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dj1mq7x3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' 
import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66853f9160> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6686151e50> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668544ed30> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668544ef40> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685468250> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6685256cd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f668528ceb0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": 
{"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2796, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 736, "free": 2796}, "nocache": {"free": 3272, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 753, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271769600, "block_size": 4096, "block_total": 65519355, "block_available": 64519475, "block_used": 999880, "inode_total": 131071472, "inode_available": 130998230, "inode_used": 73242, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, 
"features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_loadavg": {"1m": 0.52, "5m": 0.45, "15m": 0.29}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "31", "epoch": "1727204491", "epoch_int": "1727204491", "date": "2024-09-24", "time": "15:01:31", "iso8601_micro": "2024-09-24T19:01:31.136083Z", "iso8601": "2024-09-24T19:01:31Z", "iso8601_basic": "20240924T150131136083", 
"iso8601_basic_short": "20240924T150131", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing 
_codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # 
cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing 
unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # 
cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # 
cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl 
# cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
[WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
44842 1727204491.19058: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
44842 1727204491.19061: _low_level_execute_command(): starting
44842 1727204491.19066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204489.9197688-44893-179572236939517/ > /dev/null 2>&1 && sleep 0'
44842 1727204491.20719: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204491.20739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204491.20742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204491.20762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204491.20794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204491.20801: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204491.20811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204491.20825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204491.20832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204491.20839: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204491.20847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204491.20856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204491.20881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204491.20884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204491.20887: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204491.20911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204491.20978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204491.20982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204491.20988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204491.21361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204491.23213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204491.23217: stdout chunk (state=3): >>><<<
44842 1727204491.23224: stderr chunk (state=3): >>><<<
44842 1727204491.23662: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
44842 1727204491.23676: handler run complete
44842 1727204491.23815: variable 'ansible_facts' from source: unknown
44842 1727204491.24478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204491.25643: variable 'ansible_facts' from source: unknown
44842 1727204491.25646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204491.26149: attempt loop complete, returning result
44842 1727204491.26153: _execute() done
44842 1727204491.26155: dumping result to json
44842 1727204491.26199: done dumping result, returning
44842 1727204491.26207: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-aad0-d242-0000000000af]
44842 1727204491.26212: sending task result for task 0affcd87-79f5-aad0-d242-0000000000af
44842 1727204491.28616: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000af
44842 1727204491.28620: WORKER PROCESS EXITING
ok: [managed-node1]
44842 1727204491.28916: no more pending results, returning what we have
44842 1727204491.28919: results queue empty
44842 1727204491.28920: checking for any_errors_fatal
44842 1727204491.28921: done checking for any_errors_fatal
44842 1727204491.28922: checking for max_fail_percentage
44842 1727204491.28923: done checking for max_fail_percentage
44842 1727204491.28924: checking to see if all hosts have failed and the running result is not ok
44842 1727204491.28925: done checking to see if all hosts have failed
44842 1727204491.28926: getting the remaining hosts for this loop
44842 1727204491.28928: done getting the remaining hosts for this loop
44842 1727204491.28932: getting the next task for host managed-node1
44842 1727204491.28939: done getting next task for host managed-node1
44842 1727204491.28941: ^ task is: TASK: meta (flush_handlers)
44842 1727204491.28943: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204491.28947: getting variables
44842 1727204491.28949: in VariableManager get_vars()
44842 1727204491.28981: Calling all_inventory to load vars for managed-node1
44842 1727204491.28984: Calling groups_inventory to load vars for managed-node1
44842 1727204491.28987: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204491.28997: Calling all_plugins_play to load vars for managed-node1
44842 1727204491.29001: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204491.29004: Calling groups_plugins_play to load vars for managed-node1
44842 1727204491.29203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204491.29773: done with get_vars()
44842 1727204491.29785: done getting variables
44842 1727204491.30080: in VariableManager get_vars()
44842 1727204491.30090: Calling all_inventory to load vars for managed-node1
44842 1727204491.30092: Calling groups_inventory to load vars for managed-node1
44842 1727204491.30094: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204491.30099: Calling all_plugins_play to load vars for managed-node1
44842 1727204491.30101: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204491.30109: Calling groups_plugins_play to load vars for managed-node1
44842 1727204491.30380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204491.30812: done with get_vars()
44842 1727204491.30942: done queuing things up, now waiting for results queue to drain
44842 1727204491.30944: results queue empty
44842 1727204491.30945: checking for any_errors_fatal
44842 1727204491.30948: done checking for any_errors_fatal
44842 1727204491.30949: checking for max_fail_percentage
44842 1727204491.30950: done checking for max_fail_percentage
44842 1727204491.30950: checking to see if all hosts have failed and the running result is not ok
44842 1727204491.30951: done checking to see if all hosts have failed
44842 1727204491.30952: getting the remaining hosts for this loop
44842 1727204491.30953: done getting the remaining hosts for this loop
44842 1727204491.30955: getting the next task for host managed-node1
44842 1727204491.30960: done getting next task for host managed-node1
44842 1727204491.30967: ^ task is: TASK: Include the task 'el_repo_setup.yml'
44842 1727204491.30969: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204491.30971: getting variables
44842 1727204491.30972: in VariableManager get_vars()
44842 1727204491.30981: Calling all_inventory to load vars for managed-node1
44842 1727204491.30983: Calling groups_inventory to load vars for managed-node1
44842 1727204491.30985: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204491.30989: Calling all_plugins_play to load vars for managed-node1
44842 1727204491.30991: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204491.30994: Calling groups_plugins_play to load vars for managed-node1
44842 1727204491.31606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204491.32069: done with get_vars()
44842 1727204491.32077: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:11
Tuesday 24 September 2024  15:01:31 -0400 (0:00:01.458)       0:00:01.489 *****
44842 1727204491.32149: entering _queue_task() for managed-node1/include_tasks
44842 1727204491.32151: Creating lock for include_tasks
44842 1727204491.32808: worker is 1 (out of 1 available)
44842 1727204491.33072: exiting _queue_task() for managed-node1/include_tasks
44842 1727204491.33084: done queuing things up, now waiting for results queue to drain
44842 1727204491.33086: waiting for pending results...
44842 1727204491.33569: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml'
44842 1727204491.33676: in run() - task 0affcd87-79f5-aad0-d242-000000000006
44842 1727204491.33695: variable 'ansible_search_path' from source: unknown
44842 1727204491.33735: calling self._execute()
44842 1727204491.33818: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204491.33828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204491.33840: variable 'omit' from source: magic vars
44842 1727204491.33948: _execute() done
44842 1727204491.33957: dumping result to json
44842 1727204491.33972: done dumping result, returning
44842 1727204491.33984: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-aad0-d242-000000000006]
44842 1727204491.33995: sending task result for task 0affcd87-79f5-aad0-d242-000000000006
44842 1727204491.34137: no more pending results, returning what we have
44842 1727204491.34143: in VariableManager get_vars()
44842 1727204491.34179: Calling all_inventory to load vars for managed-node1
44842 1727204491.34182: Calling groups_inventory to load vars for managed-node1
44842 1727204491.34186: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204491.34200: Calling all_plugins_play to load vars for managed-node1
44842 1727204491.34203: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204491.34206: Calling groups_plugins_play to load vars for managed-node1
44842 1727204491.34407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204491.34622: done with get_vars()
44842 1727204491.34630: variable 'ansible_search_path' from source: unknown
44842 1727204491.34645: we have included files to process
44842 1727204491.34646: generating all_blocks data
44842 1727204491.34647: done generating all_blocks data
44842 
1727204491.34648: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44842 1727204491.34649: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44842 1727204491.34652: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44842 1727204491.35157: done sending task result for task 0affcd87-79f5-aad0-d242-000000000006 44842 1727204491.35160: WORKER PROCESS EXITING 44842 1727204491.35556: in VariableManager get_vars() 44842 1727204491.35573: done with get_vars() 44842 1727204491.35583: done processing included file 44842 1727204491.35585: iterating over new_blocks loaded from include file 44842 1727204491.35586: in VariableManager get_vars() 44842 1727204491.35595: done with get_vars() 44842 1727204491.35596: filtering new block on tags 44842 1727204491.35611: done filtering new block on tags 44842 1727204491.35614: in VariableManager get_vars() 44842 1727204491.35623: done with get_vars() 44842 1727204491.35624: filtering new block on tags 44842 1727204491.35638: done filtering new block on tags 44842 1727204491.35642: in VariableManager get_vars() 44842 1727204491.35651: done with get_vars() 44842 1727204491.35652: filtering new block on tags 44842 1727204491.35671: done filtering new block on tags 44842 1727204491.35673: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 44842 1727204491.35678: extending task lists for all hosts with included blocks 44842 1727204491.35723: done extending task lists 44842 1727204491.35724: done processing included files 44842 1727204491.35724: results queue empty 44842 1727204491.35725: checking for any_errors_fatal 44842 1727204491.35726: done checking for any_errors_fatal 
44842 1727204491.35727: checking for max_fail_percentage 44842 1727204491.35728: done checking for max_fail_percentage 44842 1727204491.35728: checking to see if all hosts have failed and the running result is not ok 44842 1727204491.35729: done checking to see if all hosts have failed 44842 1727204491.35730: getting the remaining hosts for this loop 44842 1727204491.35731: done getting the remaining hosts for this loop 44842 1727204491.35733: getting the next task for host managed-node1 44842 1727204491.35736: done getting next task for host managed-node1 44842 1727204491.35738: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 44842 1727204491.35741: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204491.35743: getting variables 44842 1727204491.35744: in VariableManager get_vars() 44842 1727204491.35752: Calling all_inventory to load vars for managed-node1 44842 1727204491.35754: Calling groups_inventory to load vars for managed-node1 44842 1727204491.35756: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204491.35765: Calling all_plugins_play to load vars for managed-node1 44842 1727204491.35768: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204491.35771: Calling groups_plugins_play to load vars for managed-node1 44842 1727204491.35937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204491.36145: done with get_vars() 44842 1727204491.36153: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:01:31 -0400 (0:00:00.040) 0:00:01.530 ***** 44842 1727204491.36220: entering _queue_task() for managed-node1/setup 44842 1727204491.37251: worker is 1 (out of 1 available) 44842 1727204491.37267: exiting _queue_task() for managed-node1/setup 44842 1727204491.37280: done queuing things up, now waiting for results queue to drain 44842 1727204491.37282: waiting for pending results... 
44842 1727204491.38178: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 44842 1727204491.38434: in run() - task 0affcd87-79f5-aad0-d242-0000000000c0 44842 1727204491.38460: variable 'ansible_search_path' from source: unknown 44842 1727204491.38548: variable 'ansible_search_path' from source: unknown 44842 1727204491.38595: calling self._execute() 44842 1727204491.38784: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204491.38794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204491.38807: variable 'omit' from source: magic vars 44842 1727204491.39959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204491.43280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204491.43390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204491.43431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204491.43487: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204491.43521: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204491.43609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204491.43644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204491.43678: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204491.43732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204491.43753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204491.43948: variable 'ansible_facts' from source: unknown 44842 1727204491.44104: variable 'network_test_required_facts' from source: task vars 44842 1727204491.44183: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 44842 1727204491.44255: variable 'omit' from source: magic vars 44842 1727204491.44299: variable 'omit' from source: magic vars 44842 1727204491.44497: variable 'omit' from source: magic vars 44842 1727204491.44525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204491.44557: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204491.44589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204491.44704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204491.44719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204491.44752: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204491.44756: variable 'ansible_host' from source: host vars for 
'managed-node1' 44842 1727204491.44758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204491.44969: Set connection var ansible_shell_type to sh 44842 1727204491.44987: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204491.44996: Set connection var ansible_connection to ssh 44842 1727204491.45105: Set connection var ansible_pipelining to False 44842 1727204491.45127: Set connection var ansible_timeout to 10 44842 1727204491.45140: Set connection var ansible_shell_executable to /bin/sh 44842 1727204491.45170: variable 'ansible_shell_executable' from source: unknown 44842 1727204491.45179: variable 'ansible_connection' from source: unknown 44842 1727204491.45186: variable 'ansible_module_compression' from source: unknown 44842 1727204491.45194: variable 'ansible_shell_type' from source: unknown 44842 1727204491.45200: variable 'ansible_shell_executable' from source: unknown 44842 1727204491.45207: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204491.45214: variable 'ansible_pipelining' from source: unknown 44842 1727204491.45238: variable 'ansible_timeout' from source: unknown 44842 1727204491.45248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204491.45627: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204491.45649: variable 'omit' from source: magic vars 44842 1727204491.45665: starting attempt loop 44842 1727204491.45677: running the handler 44842 1727204491.45694: _low_level_execute_command(): starting 44842 1727204491.45705: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204491.46445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 
1727204491.46463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.46481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.46500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.46547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.46568: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204491.46585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.46605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204491.46618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204491.46634: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204491.46648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.46662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.46683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.46695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.46707: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204491.46721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.46803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204491.46826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204491.46846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204491.46943: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204491.48502: stdout chunk (state=3): >>>/root <<< 44842 1727204491.48685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204491.48688: stdout chunk (state=3): >>><<< 44842 1727204491.48691: stderr chunk (state=3): >>><<< 44842 1727204491.48771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204491.48775: _low_level_execute_command(): starting 44842 1727204491.48778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778 `" && echo ansible-tmp-1727204491.4871-45004-2051659729778="` echo /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778 `" ) && sleep 
0' 44842 1727204491.50188: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204491.50201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.50213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.50229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.50272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.50284: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204491.50299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.50314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204491.50324: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204491.50332: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204491.50342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.50354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.50373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.50385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.50396: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204491.50409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.50488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204491.50512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204491.50525: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204491.50603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204491.52449: stdout chunk (state=3): >>>ansible-tmp-1727204491.4871-45004-2051659729778=/root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778 <<< 44842 1727204491.52659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204491.52671: stdout chunk (state=3): >>><<< 44842 1727204491.52675: stderr chunk (state=3): >>><<< 44842 1727204491.52781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204491.4871-45004-2051659729778=/root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204491.52785: variable 'ansible_module_compression' from source: unknown 44842 
1727204491.52890: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44842 1727204491.52999: variable 'ansible_facts' from source: unknown 44842 1727204491.53103: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/AnsiballZ_setup.py 44842 1727204491.53287: Sending initial data 44842 1727204491.53290: Sent initial data (149 bytes) 44842 1727204491.55125: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204491.55139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.55153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.55177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.55229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.55245: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204491.55265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.55286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204491.55311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204491.55324: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204491.55336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.55349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.55371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.55384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.55394: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204491.55419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.55494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204491.55517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204491.55544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204491.55642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204491.57329: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204491.57390: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204491.57435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpl_2roqfm /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/AnsiballZ_setup.py <<< 44842 1727204491.57497: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204491.60604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204491.60767: stderr chunk (state=3): >>><<< 44842 1727204491.60791: stdout chunk (state=3): >>><<< 44842 
1727204491.60875: done transferring module to remote 44842 1727204491.60878: _low_level_execute_command(): starting 44842 1727204491.60881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/ /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/AnsiballZ_setup.py && sleep 0' 44842 1727204491.61573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204491.61588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.61601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.61626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.61672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.61685: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204491.61699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.61717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204491.61736: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204491.61748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204491.61759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.61779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.61795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.61807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.61818: stderr chunk (state=3): 
>>>debug2: match found <<< 44842 1727204491.61831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.61917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204491.61939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204491.61970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204491.62052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204491.63751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204491.63841: stderr chunk (state=3): >>><<< 44842 1727204491.63854: stdout chunk (state=3): >>><<< 44842 1727204491.63967: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 
1727204491.63970: _low_level_execute_command(): starting 44842 1727204491.63973: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/AnsiballZ_setup.py && sleep 0' 44842 1727204491.65026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.65029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.65070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204491.65074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.65076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.65138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204491.65797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204491.65853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204491.67784: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 44842 1727204491.67799: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # 
import '_weakref' # <<< 44842 1727204491.67862: stdout chunk (state=3): >>>import '_io' # <<< 44842 1727204491.67872: stdout chunk (state=3): >>>import 'marshal' # <<< 44842 1727204491.67897: stdout chunk (state=3): >>>import 'posix' # <<< 44842 1727204491.67927: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 44842 1727204491.67979: stdout chunk (state=3): >>>import 'time' # <<< 44842 1727204491.67982: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 44842 1727204491.68034: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 44842 1727204491.68038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.68056: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 44842 1727204491.68077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 44842 1727204491.68080: stdout chunk (state=3): >>>import '_codecs' # <<< 44842 1727204491.68106: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37898dc0> <<< 44842 1727204491.68160: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 44842 1727204491.68167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 44842 1727204491.68171: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d3a0> <<< 44842 1727204491.68175: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37898b20> <<< 44842 1727204491.68189: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 44842 1727204491.68201: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37898ac0> <<< 44842 1727204491.68231: stdout chunk (state=3): >>>import '_signal' # <<< 44842 1727204491.68247: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 44842 1727204491.68261: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d490> <<< 44842 1727204491.68291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 44842 1727204491.68307: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 44842 1727204491.68326: stdout chunk (state=3): >>>import '_abc' # <<< 44842 1727204491.68346: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d940> <<< 44842 1727204491.68349: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d670> <<< 44842 1727204491.68391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 44842 1727204491.68394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 44842 1727204491.68436: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 44842 1727204491.68440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 44842 1727204491.68456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 44842 1727204491.68469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 44842 1727204491.68498: stdout chunk (state=3): >>>import '_stat' # <<< 44842 1727204491.68518: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375cf190> <<< 44842 1727204491.68524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 44842 1727204491.68537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 44842 1727204491.68617: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375cf220> <<< 44842 1727204491.68634: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 44842 1727204491.68653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 44842 1727204491.68680: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 44842 1727204491.68684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375cf940> <<< 44842 1727204491.68746: stdout chunk (state=3): 
>>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37855880> <<< 44842 1727204491.68750: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 44842 1727204491.68752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 44842 1727204491.68754: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375c8d90> <<< 44842 1727204491.68813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 44842 1727204491.68816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 44842 1727204491.68819: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375f2d90> <<< 44842 1727204491.68873: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d970> <<< 44842 1727204491.68903: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 44842 1727204491.69228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 44842 1727204491.69253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 44842 1727204491.69278: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 44842 1727204491.69285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 44842 1727204491.69301: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 44842 1727204491.69329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 44842 1727204491.69333: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 44842 1727204491.69351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 44842 1727204491.69354: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3756ef10> <<< 44842 1727204491.69415: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375740a0> <<< 44842 1727204491.69420: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 44842 1727204491.69435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 44842 1727204491.69446: stdout chunk (state=3): >>>import '_sre' # <<< 44842 1727204491.69484: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 44842 1727204491.69490: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 44842 1727204491.69503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 44842 1727204491.69530: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375675b0> <<< 44842 1727204491.69553: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3756f6a0> <<< 44842 1727204491.69558: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3756e3d0> <<< 44842 1727204491.69576: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 44842 1727204491.69648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 44842 1727204491.69666: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 44842 1727204491.69700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.69723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 44842 1727204491.69726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 44842 1727204491.69760: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.69766: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import 
'_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37455e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37455970> <<< 44842 1727204491.69783: stdout chunk (state=3): >>>import 'itertools' # <<< 44842 1727204491.69803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 44842 1727204491.69808: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37455f70> <<< 44842 1727204491.69842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 44842 1727204491.69848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 44842 1727204491.69862: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37455dc0> <<< 44842 1727204491.69900: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 44842 1727204491.69905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 44842 1727204491.69907: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465130> <<< 44842 1727204491.69919: stdout chunk (state=3): >>>import '_collections' # <<< 44842 1727204491.69967: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37549df0> <<< 44842 1727204491.69970: stdout chunk (state=3): >>>import '_functools' # <<< 44842 1727204491.69989: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375426d0> <<< 44842 1727204491.70054: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37555730> <<< 44842 1727204491.70060: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37575e80> <<< 44842 1727204491.70083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 44842 1727204491.70116: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.70120: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37465d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37549310> <<< 44842 1727204491.70161: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.70166: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37555340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3757ba30> <<< 44842 1727204491.70194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 44842 1727204491.70231: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 44842 1727204491.70236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.70258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 44842 1727204491.70266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 44842 1727204491.70286: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465e50> <<< 44842 1727204491.70311: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 44842 1727204491.70314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465dc0> <<< 44842 1727204491.70338: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 44842 1727204491.70341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 44842 1727204491.70355: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 44842 1727204491.70368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 44842 1727204491.70392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 44842 1727204491.70434: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 44842 1727204491.70468: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37439430> <<< 44842 1727204491.70485: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 44842 1727204491.70497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 44842 1727204491.70532: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37439520> <<< 44842 1727204491.70647: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3746dfa0> <<< 44842 1727204491.70694: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37468af0> <<< 44842 1727204491.70697: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc374684c0> <<< 44842 1727204491.70726: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 44842 1727204491.70775: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 44842 1727204491.70778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 44842 1727204491.70813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches 
/usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 44842 1727204491.70816: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3736d280> <<< 44842 1727204491.70848: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37424dc0> <<< 44842 1727204491.70901: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37468f70> <<< 44842 1727204491.70904: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3757b0a0> <<< 44842 1727204491.70931: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 44842 1727204491.70945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 44842 1727204491.70982: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3737ebb0> <<< 44842 1727204491.70994: stdout chunk (state=3): >>>import 'errno' # <<< 44842 1727204491.71028: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71035: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc3737eee0> <<< 44842 1727204491.71056: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 44842 1727204491.71062: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 44842 1727204491.71078: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 44842 1727204491.71089: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc373907f0> <<< 44842 1727204491.71116: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 44842 1727204491.71139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 44842 1727204491.71175: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37390d30> <<< 44842 1727204491.71217: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71220: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37329460> <<< 44842 1727204491.71224: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3737efd0> <<< 44842 1727204491.71248: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 44842 1727204491.71251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 44842 1727204491.71310: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71313: stdout chunk (state=3): >>># extension module '_lzma' executed from 
'/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37339340> <<< 44842 1727204491.71317: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37390670> <<< 44842 1727204491.71319: stdout chunk (state=3): >>>import 'pwd' # <<< 44842 1727204491.71333: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71347: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37339400> <<< 44842 1727204491.71377: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465a90> <<< 44842 1727204491.71399: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 44842 1727204491.71410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 44842 1727204491.71440: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 44842 1727204491.71449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 44842 1727204491.71488: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355760> <<< 44842 1727204491.71502: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches 
/usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 44842 1727204491.71534: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71548: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37355820> <<< 44842 1727204491.71569: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71585: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355910> <<< 44842 1727204491.71599: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 44842 1727204491.71792: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71795: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355d60> <<< 44842 1727204491.71835: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.71839: stdout chunk (state=3): >>># extension module '_blake2' executed 
from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc373602b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc373559a0> <<< 44842 1727204491.71855: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37349af0> <<< 44842 1727204491.71877: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465670> <<< 44842 1727204491.71903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 44842 1727204491.71955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 44842 1727204491.71993: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37355b50> <<< 44842 1727204491.72130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 44842 1727204491.72143: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7efc3728a730> <<< 44842 1727204491.72415: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip' # zipimport: zlib available <<< 44842 1727204491.72508: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.72542: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 44842 1727204491.72558: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.72571: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 
44842 1727204491.72592: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.73783: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.74704: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7880> <<< 44842 1727204491.74730: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.74755: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 44842 1727204491.74787: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 44842 1727204491.74819: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc371c7160> <<< 44842 1727204491.74852: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7280> <<< 44842 1727204491.74886: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7fd0> <<< 44842 1727204491.74910: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 44842 1727204491.74958: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c74f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7df0> <<< 44842 1727204491.74966: stdout chunk (state=3): >>>import 'atexit' # <<< 44842 1727204491.75001: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc371c7580> <<< 44842 1727204491.75008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 44842 1727204491.75039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 44842 1727204491.75075: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7100> <<< 44842 1727204491.75095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 44842 1727204491.75102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 44842 1727204491.75140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 44842 1727204491.75146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 44842 1727204491.75172: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches 
/usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 44842 1727204491.75248: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3715c070> <<< 44842 1727204491.75286: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36b893a0> <<< 44842 1727204491.75314: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36b890a0> <<< 44842 1727204491.75335: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 44842 1727204491.75350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 44842 1727204491.75391: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36b89d00> <<< 44842 1727204491.75398: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371afdc0> <<< 44842 1727204491.75555: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371af3a0> <<< 44842 1727204491.75586: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from 
'/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 44842 1727204491.75613: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371aff40> <<< 44842 1727204491.75622: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 44842 1727204491.75637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 44842 1727204491.75671: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 44842 1727204491.75685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 44842 1727204491.75705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 44842 1727204491.75732: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371fee80> <<< 44842 1727204491.75814: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37185d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37185460> <<< 44842 1727204491.75823: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c4ac0> <<< 44842 1727204491.75847: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' 
import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37185580> <<< 44842 1727204491.75879: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371855b0> <<< 44842 1727204491.75897: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 44842 1727204491.75913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 44842 1727204491.75927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 44842 1727204491.75967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 44842 1727204491.76029: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36bf4f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc372112b0> <<< 44842 1727204491.76066: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 44842 1727204491.76073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 44842 1727204491.76123: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36bf17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37211430> <<< 44842 1727204491.76152: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 44842 1727204491.76188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.76212: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 44842 1727204491.76218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 44842 1727204491.76281: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37211c40> <<< 44842 1727204491.76405: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36bf1790> <<< 44842 1727204491.76499: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37211100> <<< 44842 1727204491.76529: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc372115b0> <<< 44842 1727204491.76574: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37211f70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37209970> <<< 44842 1727204491.76603: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 44842 1727204491.76627: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 44842 1727204491.76633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 44842 1727204491.76680: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36be78e0> <<< 44842 1727204491.76852: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7efc37120df0> <<< 44842 1727204491.76869: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36bf0520> <<< 44842 1727204491.76899: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.76922: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36be7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36bf0940> # zipimport: zlib available <<< 44842 1727204491.76937: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 44842 1727204491.76952: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.77026: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.77107: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44842 1727204491.77117: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 44842 1727204491.77143: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44842 1727204491.77153: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 44842 1727204491.77169: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.77257: stdout chunk (state=3): >>># zipimport: zlib available <<< 
44842 1727204491.77358: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.77796: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.78268: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 44842 1727204491.78272: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 44842 1727204491.78302: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.78365: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204491.78369: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc3711b790> <<< 44842 1727204491.78445: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 44842 1727204491.78451: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3715a850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36791fd0> <<< 44842 1727204491.78509: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 44842 1727204491.78536: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.78552: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 44842 1727204491.78558: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.78673: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.78803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 44842 1727204491.78834: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3718d2e0> # zipimport: zlib available <<< 44842 1727204491.79231: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.79593: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.79650: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.79716: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 44842 1727204491.79723: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.79753: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.79788: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 44842 1727204491.79850: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.79935: stdout chunk (state=3): >>>import 
ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 44842 1727204491.79952: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 44842 1727204491.79967: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80001: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80043: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 44842 1727204491.80049: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80224: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80414: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 44842 1727204491.80437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 44842 1727204491.80451: stdout chunk (state=3): >>>import '_ast' # <<< 44842 1727204491.80520: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371cdca0> <<< 44842 1727204491.80527: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80587: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80653: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 44842 1727204491.80669: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 44842 1727204491.80683: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80723: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80760: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 44842 1727204491.80767: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80800: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80842: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80930: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.80994: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 44842 1727204491.81014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.81090: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc3713ec40> <<< 44842 
1727204491.81175: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371cdbe0> <<< 44842 1727204491.81206: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 44842 1727204491.81213: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81273: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81321: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81348: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81386: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 44842 1727204491.81402: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 44842 1727204491.81416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 44842 1727204491.81458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 44842 1727204491.81476: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 44842 1727204491.81501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 44842 1727204491.81572: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37151910> <<< 44842 1727204491.81613: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7efc3719bb50> <<< 44842 1727204491.81676: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36603820> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 44842 1727204491.81708: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81729: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 44842 1727204491.81811: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 44842 1727204491.81831: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44842 1727204491.81842: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 44842 1727204491.81849: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81903: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81955: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81976: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.81990: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82036: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82073: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82101: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 44842 1727204491.82125: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 44842 1727204491.82142: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82206: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82276: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82288: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82325: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 44842 1727204491.82480: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82612: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82649: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.82696: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204491.82724: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 44842 1727204491.82750: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 44842 1727204491.82789: stdout chunk (state=3): >>>import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc364ee100> <<< 44842 1727204491.82804: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 44842 1727204491.82821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 44842 1727204491.82833: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 44842 1727204491.82867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 44842 1727204491.82889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 44842 1727204491.82909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36752a90> <<< 44842 1727204491.82952: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36752a00> <<< 44842 1727204491.83025: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36725dc0> <<< 44842 1727204491.83031: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36725790> <<< 44842 1727204491.83059: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367704c0> <<< 44842 1727204491.83072: stdout chunk (state=3): 
>>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36770d60> <<< 44842 1727204491.83088: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 44842 1727204491.83100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 44842 1727204491.83128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 44842 1727204491.83162: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36735ee0> <<< 44842 1727204491.83182: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367359d0> <<< 44842 1727204491.83195: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 44842 1727204491.83210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 44842 1727204491.83231: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367351f0> <<< 44842 1727204491.83256: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 44842 1727204491.83276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 
44842 1727204491.83307: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36550280> <<< 44842 1727204491.83337: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37219a30> <<< 44842 1727204491.83359: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36770070> <<< 44842 1727204491.83372: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 44842 1727204491.83405: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 44842 1727204491.83417: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83477: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83525: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 44842 1727204491.83531: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83576: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83633: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 44842 1727204491.83644: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 44842 1727204491.83659: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83687: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83719: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 44842 1727204491.83773: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83813: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 44842 1727204491.83820: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83853: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.83897: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 44842 1727204491.83957: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84005: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84060: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84112: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 44842 1727204491.84118: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 44842 1727204491.84501: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84861: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 44842 1727204491.84871: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84910: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84961: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.84986: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85023: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 44842 1727204491.85029: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85055: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85087: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 44842 1727204491.85141: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85192: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 44842 1727204491.85199: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85218: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85254: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 44842 1727204491.85326: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 44842 1727204491.85386: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85458: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 44842 1727204491.85486: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36438ee0> <<< 44842 1727204491.85501: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 44842 1727204491.85532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 44842 1727204491.85687: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc364389d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 44842 1727204491.85750: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 44842 1727204491.85807: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 44842 1727204491.85815: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85888: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.85970: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 44842 1727204491.86029: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.86099: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 44842 1727204491.86106: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.86133: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.86186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 44842 1727204491.86196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 44842 1727204491.86341: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36462040> <<< 44842 1727204491.86579: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367708e0> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 44842 1727204491.86627: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.86680: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 44842 1727204491.86754: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.86822: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.86922: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87054: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 44842 1727204491.87096: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87124: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 44842 1727204491.87143: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87175: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87220: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 44842 1727204491.87282: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc364b4df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc364b4580> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 44842 1727204491.87303: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44842 1727204491.87315: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 44842 1727204491.87356: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87402: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 44842 1727204491.87528: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87659: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 44842 1727204491.87744: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87827: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87868: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.87899: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 44842 1727204491.87914: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 44842 1727204491.87993: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.88008: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.88123: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.88251: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 44842 1727204491.88355: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.88459: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 44842 1727204491.88495: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.88523: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.88962: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.89394: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 44842 1727204491.89484: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.89579: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 44842 1727204491.89586: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.89665: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.89748: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 44842 1727204491.89879: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.90015: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 44842 1727204491.90036: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 44842 1727204491.90045: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.90086: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.90127: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 44842 1727204491.90134: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.90213: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 44842 1727204491.90294: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.90468: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.90634: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 44842 1727204491.90641: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91070: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91073: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 44842 1727204491.91075: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91077: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91078: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 44842 1727204491.91080: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91082: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 44842 1727204491.91083: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91085: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91087: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 44842 1727204491.91088: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91090: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 44842 1727204491.91094: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91108: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91165: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 44842 1727204491.91385: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91599: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 44842 1727204491.91606: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91654: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91970: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 44842 1727204491.91974: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91976: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 44842 1727204491.91979: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91981: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91983: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 44842 1727204491.91985: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91987: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 44842 1727204491.91989: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.91990: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92069: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 44842 1727204491.92080: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 44842 1727204491.92099: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92134: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92183: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 44842 1727204491.92204: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92218: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92267: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92304: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92367: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92441: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 44842 1727204491.92448: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92489: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92536: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 44842 1727204491.92700: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92868: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 44842 1727204491.92872: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92905: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.92950: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 44842 1727204491.93001: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 44842 1727204491.93041: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 44842 1727204491.93049: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.93115: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.93191: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 44842 1727204491.93270: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.93347: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 44842 1727204491.93354: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 44842 1727204491.93415: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204491.93644: stdout chunk (state=3): >>>import 'gc' # <<< 44842 1727204491.93955: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 44842 1727204491.93988: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 44842 1727204491.93995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 44842 1727204491.94037: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36264ee0> <<< 44842 1727204491.94045: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36227280> <<< 44842 1727204491.94107: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36227190> <<< 44842 1727204491.95069: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", 
"ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", 
"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "31", "epoch": "1727204491", "epoch_int": "1727204491", "date": "2024-09-24", "time": "15:01:31", "iso8601_micro": "2024-09-24T19:01:31.944258Z", "iso8601": "2024-09-24T19:01:31Z", "iso8601_basic": "20240924T150131944258", "iso8601_basic_short": "20240924T150131", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", 
"serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204491.95497: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 44842 1727204491.95573: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # 
cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq <<< 44842 1727204491.95613: stdout chunk (state=3): >>># cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # 
cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil <<< 44842 1727204491.95647: stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 44842 1727204491.95673: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters <<< 44842 1727204491.95689: stdout chunk (state=3): >>># 
cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 44842 1727204491.95704: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # 
cleanup[2] removing ansible.module_utils.facts.sysctl <<< 44842 1727204491.95726: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd <<< 44842 1727204491.95751: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform <<< 44842 1727204491.95764: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 44842 1727204491.96030: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44842 1727204491.96041: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 44842 1727204491.96066: stdout chunk (state=3): >>># destroy zipimport <<< 44842 1727204491.96082: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 44842 1727204491.96114: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 44842 1727204491.96123: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy encodings <<< 44842 1727204491.96140: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 44842 1727204491.96179: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 44842 1727204491.96209: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 44842 1727204491.96250: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 44842 1727204491.96267: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex <<< 44842 1727204491.96291: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 44842 1727204491.96316: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy 
getpass # destroy json <<< 44842 1727204491.96338: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 44842 1727204491.96625: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 44842 1727204491.96634: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 44842 1727204491.96803: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq <<< 44842 1727204491.96817: stdout chunk 
(state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 44842 1727204491.96847: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator <<< 44842 1727204491.96857: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 44842 1727204491.96899: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 44842 1727204491.97253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204491.97256: stdout chunk (state=3): >>><<< 44842 1727204491.97270: stderr chunk (state=3): >>><<< 44842 1727204491.97413: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37898dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37898b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37898ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375cf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc37855880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3783d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3756ef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3756f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3756e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37455e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37455970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37455f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37455dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37549df0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc375426d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37555730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37575e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37465d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37549310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37555340> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3757ba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37439430> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37439520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3746dfa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37468af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc374684c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3736d280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37424dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37468f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3757b0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3737ebb0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc3737eee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc373907f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37390d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37329460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3737efd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37339340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37390670> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37339400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37355820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37355d60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc373602b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc373559a0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37349af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37465670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37355b50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7efc3728a730> # zipimport: found 103 names in '/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc371c7160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7fd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c74f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7df0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7efc371c7580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c7100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3715c070> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36b893a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36b890a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36b89d00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7efc371afdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371af3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371aff40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371fee80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37185d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37185460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371c4ac0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37185580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371855b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36bf4f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc372112b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36bf17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37211430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37211c40> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36bf1790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37211100> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc372115b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37211f70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc37209970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7efc36be78e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc37120df0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36bf0520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36be7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36bf0940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc3711b790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3715a850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36791fd0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3718d2e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371cdca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc3713ec40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc371cdbe0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc37151910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc3719bb50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36603820> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc364ee100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36752a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36752a00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36725dc0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36725790> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367704c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36770d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36735ee0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367359d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367351f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36550280> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7efc37219a30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36770070> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36438ee0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc364389d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36462040> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc367708e0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc364b4df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc364b4580> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_5iwldf7_/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc36264ee0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36227280> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc36227190> {"ansible_facts": {"ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "31", "epoch": "1727204491", "epoch_int": "1727204491", "date": "2024-09-24", "time": "15:01:31", "iso8601_micro": "2024-09-24T19:01:31.944258Z", "iso8601": "2024-09-24T19:01:31Z", "iso8601_basic": "20240924T150131944258", "iso8601_basic_short": "20240924T150131", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], 
"executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing 
_collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # 
cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] 
removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] 
wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # 
destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # 
cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # 
cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # 
destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # 
cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # 
destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # 
cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy 
sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 44842 1727204491.98531: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204491.98536: _low_level_execute_command(): starting 44842 1727204491.98538: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204491.4871-45004-2051659729778/ > /dev/null 2>&1 && sleep 0' 44842 1727204491.98851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204491.98855: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204491.98857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.98860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.98959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.98965: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204491.98967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.98970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204491.98972: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204491.98974: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204491.98976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204491.98979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204491.98981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204491.98983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204491.98985: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204491.98986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204491.99046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204491.99067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204491.99081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204491.99171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44842 1727204492.01018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204492.01025: stdout chunk (state=3): >>><<< 44842 1727204492.01027: stderr chunk (state=3): >>><<< 44842 1727204492.01044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204492.01050: handler run complete 44842 1727204492.01104: variable 'ansible_facts' from source: unknown 44842 1727204492.01157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.01277: variable 'ansible_facts' from source: unknown 44842 1727204492.01322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.01378: attempt loop complete, returning result 44842 1727204492.01381: _execute() done 44842 
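The repeated "auto-mux: Trying existing master" / "mux_client_request_session" lines above come from OpenSSH connection multiplexing: Ansible reuses one master SSH connection for every command it runs on the host. A client-side configuration like the following would produce this behavior; the fragment below is an illustrative assumption, not taken from this run, and the ControlPath pattern is arbitrary.

```
# Illustrative ~/.ssh/config fragment (assumed, not from this log):
# ControlMaster=auto reuses an existing master connection when present,
# which is what the "auto-mux: Trying existing master" debug lines show.
Host 10.31.9.148
    ControlMaster auto
    ControlPath ~/.ssh/cp-%r@%h:%p
    ControlPersist 60s
```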
1727204492.01383: dumping result to json 44842 1727204492.01395: done dumping result, returning 44842 1727204492.01404: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-aad0-d242-0000000000c0] 44842 1727204492.01409: sending task result for task 0affcd87-79f5-aad0-d242-0000000000c0 44842 1727204492.01874: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000c0 44842 1727204492.01877: WORKER PROCESS EXITING ok: [managed-node1] 44842 1727204492.01996: no more pending results, returning what we have 44842 1727204492.02000: results queue empty 44842 1727204492.02001: checking for any_errors_fatal 44842 1727204492.02002: done checking for any_errors_fatal 44842 1727204492.02003: checking for max_fail_percentage 44842 1727204492.02004: done checking for max_fail_percentage 44842 1727204492.02005: checking to see if all hosts have failed and the running result is not ok 44842 1727204492.02006: done checking to see if all hosts have failed 44842 1727204492.02006: getting the remaining hosts for this loop 44842 1727204492.02008: done getting the remaining hosts for this loop 44842 1727204492.02012: getting the next task for host managed-node1 44842 1727204492.02023: done getting next task for host managed-node1 44842 1727204492.02026: ^ task is: TASK: Check if system is ostree 44842 1727204492.02029: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.02033: getting variables 44842 1727204492.02034: in VariableManager get_vars() 44842 1727204492.02075: Calling all_inventory to load vars for managed-node1 44842 1727204492.02079: Calling groups_inventory to load vars for managed-node1 44842 1727204492.02083: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.02094: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.02097: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.02101: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.02314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.02511: done with get_vars() 44842 1727204492.02522: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.664) 0:00:02.195 ***** 44842 1727204492.02713: entering _queue_task() for managed-node1/stat 44842 1727204492.03026: worker is 1 (out of 1 available) 44842 1727204492.03039: exiting _queue_task() for managed-node1/stat 44842 1727204492.03058: done queuing things up, now waiting for results queue to drain 44842 1727204492.03060: waiting for pending results... 
44842 1727204492.03317: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 44842 1727204492.03433: in run() - task 0affcd87-79f5-aad0-d242-0000000000c2 44842 1727204492.03452: variable 'ansible_search_path' from source: unknown 44842 1727204492.03459: variable 'ansible_search_path' from source: unknown 44842 1727204492.03509: calling self._execute() 44842 1727204492.03590: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.03610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.03623: variable 'omit' from source: magic vars 44842 1727204492.04114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204492.04395: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204492.04445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204492.04491: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204492.04530: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204492.04650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204492.04682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204492.04723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204492.04752: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204492.04888: Evaluated conditional (not __network_is_ostree is defined): True 44842 1727204492.04899: variable 'omit' from source: magic vars 44842 1727204492.04950: variable 'omit' from source: magic vars 44842 1727204492.04993: variable 'omit' from source: magic vars 44842 1727204492.05030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204492.05060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204492.05086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204492.05106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204492.05242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204492.05281: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204492.05289: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.05296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.05503: Set connection var ansible_shell_type to sh 44842 1727204492.05517: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204492.05526: Set connection var ansible_connection to ssh 44842 1727204492.05534: Set connection var ansible_pipelining to False 44842 1727204492.05543: Set connection var ansible_timeout to 10 44842 1727204492.05556: Set connection var ansible_shell_executable to /bin/sh 44842 1727204492.05699: variable 'ansible_shell_executable' from source: unknown 44842 1727204492.05707: variable 'ansible_connection' from 
source: unknown 44842 1727204492.05714: variable 'ansible_module_compression' from source: unknown 44842 1727204492.05720: variable 'ansible_shell_type' from source: unknown 44842 1727204492.05726: variable 'ansible_shell_executable' from source: unknown 44842 1727204492.05732: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.05739: variable 'ansible_pipelining' from source: unknown 44842 1727204492.05745: variable 'ansible_timeout' from source: unknown 44842 1727204492.05751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.06080: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204492.06093: variable 'omit' from source: magic vars 44842 1727204492.06100: starting attempt loop 44842 1727204492.06110: running the handler 44842 1727204492.06132: _low_level_execute_command(): starting 44842 1727204492.06142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204492.07295: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204492.07315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.07331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.07358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.07407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204492.07424: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204492.07440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 
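The "Set connection var ..." lines above reflect per-host connection settings that Ansible resolves from inventory and play variables. A host_vars snippet like the following would yield the same values; it is a hypothetical reconstruction for illustration, not the inventory actually used in this run.

```yaml
# Illustrative host_vars for managed-node1 (assumed, not from this log):
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
```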
1727204492.07470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204492.07484: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204492.07496: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204492.07508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.07523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.07544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.07568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204492.07582: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204492.07597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.07685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204492.07708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204492.07724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204492.07816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204492.09707: stdout chunk (state=3): >>>/root <<< 44842 1727204492.09896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204492.09900: stdout chunk (state=3): >>><<< 44842 1727204492.09904: stderr chunk (state=3): >>><<< 44842 1727204492.10040: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204492.10051: _low_level_execute_command(): starting 44842 1727204492.10055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687 `" && echo ansible-tmp-1727204492.0994155-45034-212500635337687="` echo /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687 `" ) && sleep 0' 44842 1727204492.10749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204492.10766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.10782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.10800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.10855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204492.10868: stderr chunk (state=3): >>>debug2: match not found <<< 44842 
1727204492.10887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.10905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204492.10916: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204492.10927: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204492.10951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.10970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.10986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.10998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204492.11009: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204492.11023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.11108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204492.11129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204492.11143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204492.11228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204492.13112: stdout chunk (state=3): >>>ansible-tmp-1727204492.0994155-45034-212500635337687=/root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687 <<< 44842 1727204492.13338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204492.13342: stdout chunk (state=3): >>><<< 44842 1727204492.13345: stderr chunk (state=3): >>><<< 44842 1727204492.13578: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204492.0994155-45034-212500635337687=/root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204492.13582: variable 'ansible_module_compression' from source: unknown 44842 1727204492.13585: ANSIBALLZ: Using lock for stat 44842 1727204492.13587: ANSIBALLZ: Acquiring lock 44842 1727204492.13589: ANSIBALLZ: Lock acquired: 140164881037648 44842 1727204492.13591: ANSIBALLZ: Creating module 44842 1727204492.30454: ANSIBALLZ: Writing module into payload 44842 1727204492.30540: ANSIBALLZ: Writing module 44842 1727204492.30560: ANSIBALLZ: Renaming module 44842 1727204492.30568: ANSIBALLZ: Done creating module 44842 1727204492.30586: variable 'ansible_facts' from source: unknown 44842 1727204492.30656: transferring module to remote 
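The tmpdir-creation command executed above uses a common POSIX shell idiom: `umask 77` is run inside a subshell `( ... )`, so the permission mask applies only to the directories being created (mode 700) and the caller's umask is left untouched. A minimal sketch of the same idiom, with illustrative paths rather than the timestamped ones from this run:

```shell
# Sketch of the remote tmpdir idiom (paths are illustrative, not the
# ansible-tmp-... paths from this run).
# The subshell confines the umask change: both directories are created
# with mode 700, and the caller's umask is unaffected afterwards.
base=$(mktemp -d)
newtmp=$( ( umask 77 && mkdir -p "$base/.ansible/tmp" \
    && mkdir "$base/.ansible/tmp/ansible-tmp-example" \
    && echo "$base/.ansible/tmp/ansible-tmp-example" ) )
echo "$newtmp"
```

Ansible then echoes the created path back (as seen in the `stdout=ansible-tmp-...=` line) so the controller learns where to upload the AnsiballZ module payload.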
/root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/AnsiballZ_stat.py 44842 1727204492.30819: Sending initial data 44842 1727204492.30828: Sent initial data (153 bytes) 44842 1727204492.31765: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.31770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.31816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.31819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.31821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.31888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204492.31903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204492.32000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204492.34519: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 
2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204492.34570: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204492.34629: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp3jer2ezk /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/AnsiballZ_stat.py <<< 44842 1727204492.34683: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204492.35565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204492.35695: stderr chunk (state=3): >>><<< 44842 1727204492.35698: stdout chunk (state=3): >>><<< 44842 1727204492.35719: done transferring module to remote 44842 1727204492.35732: _low_level_execute_command(): starting 44842 1727204492.35737: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/ /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/AnsiballZ_stat.py && sleep 0' 44842 1727204492.36220: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.36225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.36259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.36287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204492.36290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.36292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.36347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204492.36350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204492.36354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204492.36411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204492.38471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204492.38522: stderr chunk (state=3): >>><<< 44842 1727204492.38525: stdout chunk (state=3): >>><<< 44842 1727204492.38543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204492.38546: _low_level_execute_command(): starting 44842 1727204492.38551: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/AnsiballZ_stat.py && sleep 0' 44842 1727204492.39132: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.39136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.39173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.39176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.39178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.39228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204492.39234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204492.39310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204492.42313: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81df3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81df3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81df3ac0> import '_signal' # # 
/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 44842 1727204492.42325: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98490> <<< 44842 1727204492.42352: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 44842 1727204492.42387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 44842 1727204492.42418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98940> <<< 44842 1727204492.42477: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98670> <<< 44842 1727204492.42493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 44842 1727204492.42496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 44842 1727204492.42573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 44842 1727204492.42629: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches 
/usr/lib64/python3.9/_collections_abc.py <<< 44842 1727204492.42643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 44842 1727204492.42708: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d4f220> <<< 44842 1727204492.42802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d4f940> <<< 44842 1727204492.42858: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81db0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d48d90> <<< 44842 1727204492.42925: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d72d90> <<< 44842 1727204492.42983: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98970> <<< 44842 1727204492.42999: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 
11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 44842 1727204492.43188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 44842 1727204492.43218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 44842 1727204492.43239: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 44842 1727204492.43272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 44842 1727204492.43306: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 44842 1727204492.43309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81ceef10> <<< 44842 1727204492.43357: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cf40a0> <<< 44842 1727204492.43379: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 44842 1727204492.43391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 44842 1727204492.43397: stdout chunk (state=3): >>>import '_sre' # <<< 44842 1727204492.43426: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 44842 1727204492.43431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 44842 
1727204492.43456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 44842 1727204492.43489: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81ce75b0> <<< 44842 1727204492.43501: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cef6a0> <<< 44842 1727204492.43504: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cee3d0> <<< 44842 1727204492.43525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 44842 1727204492.43597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 44842 1727204492.43616: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 44842 1727204492.43654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204492.43672: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 44842 1727204492.43677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 44842 1727204492.43748: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81c71e80> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c71970> <<< 44842 1727204492.43752: stdout chunk (state=3): >>>import 'itertools' # <<< 44842 1727204492.43755: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c71f70> <<< 44842 1727204492.43809: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 44842 1727204492.43841: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c71dc0> <<< 44842 1727204492.43856: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81130> <<< 44842 1727204492.43883: stdout chunk (state=3): >>>import '_collections' # <<< 44842 1727204492.43935: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cc9df0> <<< 44842 1727204492.43939: stdout chunk (state=3): >>>import '_functools' # <<< 44842 1727204492.43950: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cc26d0> <<< 44842 1727204492.44029: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cd5730> <<< 44842 1727204492.44033: stdout chunk 
(state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cf5e80> <<< 44842 1727204492.44047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 44842 1727204492.44082: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81c81d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cc9310> <<< 44842 1727204492.44161: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204492.44167: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81cd5340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cfba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 44842 1727204492.44170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 44842 1727204492.44189: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 44842 1727204492.44222: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81e50> <<< 44842 1727204492.44251: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 44842 1727204492.44278: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81dc0> <<< 44842 1727204492.44281: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 44842 1727204492.44292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 44842 1727204492.44298: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 44842 1727204492.44321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 44842 1727204492.44330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 44842 1727204492.44386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 44842 1727204492.44417: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 44842 1727204492.44420: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c55430> <<< 44842 1727204492.44438: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 44842 1727204492.44443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 44842 1727204492.44487: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c55520> <<< 44842 1727204492.44904: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c8afa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c84af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c844c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81964280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c40dc0> <<< 44842 1727204492.45019: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c84f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cfb0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from 
'/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81974bb0> import 'errno' # <<< 44842 1727204492.45395: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81974ee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e819877f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81987d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81914460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81974fd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from 
'/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81924340> <<< 44842 1727204492.45435: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81987670> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81924400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81a90><<< 44842 1727204492.45441: stdout chunk (state=3): >>> <<< 44842 1727204492.45456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 44842 1727204492.45484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 44842 1727204492.45525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 44842 1727204492.45528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 44842 1727204492.45561: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940760> <<< 44842 1727204492.45587: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 44842 1727204492.45628: stdout chunk (state=3): >>># extension 
module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81940820> <<< 44842 1727204492.45659: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940910> <<< 44842 1727204492.45703: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 44842 1727204492.46015: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204492.46018: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940d60> <<< 44842 1727204492.46104: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204492.46127: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8194b2b0> <<< 44842 1727204492.46130: stdout chunk (state=3): >>>import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1e819409a0> <<< 44842 1727204492.46166: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81934af0><<< 44842 1727204492.46176: stdout chunk (state=3): >>> <<< 44842 1727204492.46204: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81670> <<< 44842 1727204492.46244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py<<< 44842 1727204492.46247: stdout chunk (state=3): >>> <<< 44842 1727204492.46328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 44842 1727204492.46391: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81940b50> <<< 44842 1727204492.46533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 44842 1727204492.46565: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1e8185b730><<< 44842 1727204492.46578: stdout chunk (state=3): >>> <<< 44842 1727204492.46979: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip' # zipimport: zlib available <<< 44842 1727204492.47118: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.47184: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 44842 1727204492.47207: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.47219: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 44842 
1727204492.48586: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.50151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py<<< 44842 1727204492.50155: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 44842 1727204492.50157: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782880> <<< 44842 1727204492.50214: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 44842 1727204492.50217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204492.50256: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 44842 1727204492.50276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 44842 1727204492.50321: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 44842 1727204492.50324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 44842 1727204492.50360: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204492.50398: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81782160> <<< 44842 1727204492.50467: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782280> <<< 
44842 1727204492.50521: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782fd0> <<< 44842 1727204492.50565: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 44842 1727204492.50583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 44842 1727204492.50671: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817824f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782df0><<< 44842 1727204492.50687: stdout chunk (state=3): >>> <<< 44842 1727204492.50690: stdout chunk (state=3): >>>import 'atexit' # <<< 44842 1727204492.50751: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'<<< 44842 1727204492.50754: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81782580> <<< 44842 1727204492.50791: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 44842 1727204492.50834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 44842 1727204492.50903: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782100> <<< 44842 1727204492.50938: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py<<< 44842 1727204492.50948: stdout chunk (state=3): >>> <<< 44842 1727204492.50977: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 44842 1727204492.51012: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 44842 1727204492.51049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 44842 1727204492.51101: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py<<< 44842 1727204492.51106: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 44842 1727204492.51228: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e811adfa0> <<< 44842 1727204492.51311: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e811cbc70><<< 44842 1727204492.51314: stdout chunk (state=3): >>> <<< 44842 1727204492.51374: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e811cbf70><<< 44842 1727204492.51377: stdout chunk (state=3): >>> <<< 44842 1727204492.51416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 44842 1727204492.51458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 44842 
1727204492.51514: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e811cb310> <<< 44842 1727204492.51543: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e9dc0> <<< 44842 1727204492.51828: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e93a0> <<< 44842 1727204492.51875: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py<<< 44842 1727204492.51890: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 44842 1727204492.51918: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e9f40> <<< 44842 1727204492.51950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py<<< 44842 1727204492.51980: stdout chunk (state=3): >>> <<< 44842 1727204492.52020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py<<< 44842 1727204492.52031: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 44842 1727204492.52065: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 44842 1727204492.52095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 44842 1727204492.52135: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py<<< 44842 1727204492.52160: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817b9e80><<< 44842 1727204492.52172: stdout chunk (state=3): >>> <<< 44842 1727204492.52295: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81755d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81755460> <<< 44842 1727204492.52330: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8178c550> <<< 44842 1727204492.52383: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204492.52396: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81755580> <<< 44842 1727204492.52461: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817555b0> <<< 44842 1727204492.52521: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 44842 1727204492.52534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 44842 1727204492.52573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 44842 1727204492.52614: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 44842 1727204492.52733: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119ef70><<< 44842 1727204492.52756: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817cb2b0> <<< 44842 1727204492.52793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 44842 1727204492.52817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 44842 1727204492.52924: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119b7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817cb430> <<< 44842 1727204492.52974: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 44842 1727204492.53042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204492.53087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py<<< 44842 1727204492.53115: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 44842 1727204492.53212: stdout chunk (state=3): 
>>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e2e80> <<< 44842 1727204492.53437: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8119b790> <<< 44842 1727204492.53584: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119b5e0> <<< 44842 1727204492.53652: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119a550><<< 44842 1727204492.53663: stdout chunk (state=3): >>> <<< 44842 1727204492.53741: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119a490><<< 44842 1727204492.53768: stdout chunk (state=3): >>> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817c1970><<< 44842 1727204492.53779: stdout chunk (state=3): >>> <<< 44842 1727204492.53822: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 44842 1727204492.53860: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 44842 1727204492.53890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc'<<< 44842 1727204492.53905: stdout chunk (state=3): >>> <<< 44842 1727204492.53980: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 44842 1727204492.53992: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8174b6a0> <<< 44842 1727204492.54314: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so'<<< 44842 1727204492.54346: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8174ab80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8175b0a0><<< 44842 1727204492.54357: stdout chunk (state=3): >>> <<< 44842 1727204492.54423: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 44842 1727204492.54435: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8174b100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8178ebe0> <<< 
44842 1727204492.54481: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.54506: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.54531: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 44842 1727204492.54545: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.54656: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.54788: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.54818: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 44842 1727204492.54844: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.54887: stdout chunk (state=3): >>> # zipimport: zlib available <<< 44842 1727204492.54913: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 44842 1727204492.54928: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.55085: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.55252: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.56007: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.56724: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 44842 1727204492.56765: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 44842 1727204492.56781: stdout chunk (state=3): >>>import 
ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 44842 1727204492.56813: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 44842 1727204492.56841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204492.56941: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 44842 1727204492.56958: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81167ac0> <<< 44842 1727204492.57055: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc'<<< 44842 1727204492.57081: stdout chunk (state=3): >>> <<< 44842 1727204492.57111: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81749d00> <<< 44842 1727204492.57122: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8173f850> <<< 44842 1727204492.57182: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 44842 1727204492.57222: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.57244: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.57288: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from 
Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 44842 1727204492.57308: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.57501: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.57717: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py<<< 44842 1727204492.57720: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 44842 1727204492.57772: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8174a9d0><<< 44842 1727204492.57790: stdout chunk (state=3): >>> <<< 44842 1727204492.57797: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.57804: stdout chunk (state=3): >>> <<< 44842 1727204492.58445: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.58451: stdout chunk (state=3): >>> <<< 44842 1727204492.59077: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59083: stdout chunk (state=3): >>> <<< 44842 1727204492.59171: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59178: stdout chunk (state=3): >>> <<< 44842 1727204492.59289: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/collections.py<<< 44842 1727204492.59292: stdout chunk (state=3): >>> <<< 44842 1727204492.59295: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.59352: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59358: stdout chunk (state=3): >>> <<< 44842 1727204492.59419: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip 
/tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py<<< 44842 1727204492.59423: stdout chunk (state=3): >>> <<< 44842 1727204492.59425: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.59529: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59536: stdout chunk (state=3): >>> <<< 44842 1727204492.59659: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/errors.py<<< 44842 1727204492.59668: stdout chunk (state=3): >>> <<< 44842 1727204492.59670: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59691: stdout chunk (state=3): >>> <<< 44842 1727204492.59709: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59712: stdout chunk (state=3): >>> <<< 44842 1727204492.59714: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py<<< 44842 1727204492.59716: stdout chunk (state=3): >>> <<< 44842 1727204492.59742: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.59804: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.59809: stdout chunk (state=3): >>> <<< 44842 1727204492.59855: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 44842 1727204492.59881: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.60204: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.60522: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 44842 1727204492.60585: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 44842 1727204492.60593: stdout chunk (state=3): >>>import '_ast' # <<< 44842 1727204492.60600: stdout chunk (state=3): >>> <<< 44842 1727204492.60731: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e80d70310> <<< 44842 1727204492.60734: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.60830: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.60950: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py<<< 44842 1727204492.60957: stdout chunk (state=3): >>> import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/validation.py<<< 44842 1727204492.60963: stdout chunk (state=3): >>> import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py<<< 44842 1727204492.60966: stdout chunk (state=3): >>> <<< 44842 1727204492.60986: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 44842 1727204492.61015: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.61019: stdout chunk (state=3): >>> <<< 44842 1727204492.61089: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.61092: stdout chunk (state=3): >>> <<< 44842 1727204492.61144: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 44842 1727204492.61158: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 44842 1727204492.61221: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.61224: stdout chunk (state=3): >>> <<< 44842 1727204492.61284: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.61422: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.61517: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 44842 1727204492.61569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 44842 1727204492.61689: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e817d32b0> <<< 44842 1727204492.61752: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817497c0> <<< 44842 1727204492.61828: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 44842 1727204492.61832: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 44842 1727204492.61845: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.62015: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.62019: stdout chunk (state=3): >>> <<< 44842 1727204492.62099: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.62102: 
stdout chunk (state=3): >>> <<< 44842 1727204492.62137: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.62215: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 44842 1727204492.62228: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 44842 1727204492.62269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 44842 1727204492.62324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 44842 1727204492.62362: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py<<< 44842 1727204492.62383: stdout chunk (state=3): >>> <<< 44842 1727204492.62396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 44842 1727204492.62546: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e80d52760> <<< 44842 1727204492.62609: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8115d610> <<< 44842 1727204492.62694: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8115cb80><<< 44842 1727204492.62702: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py<<< 44842 1727204492.62721: stdout chunk (state=3): >>> <<< 44842 1727204492.62736: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.62738: stdout chunk (state=3): >>> <<< 44842 1727204492.62782: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 44842 1727204492.62824: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py<<< 44842 1727204492.62829: stdout chunk (state=3): >>> <<< 44842 1727204492.62831: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py<<< 44842 1727204492.62833: stdout chunk (state=3): >>> <<< 44842 1727204492.62935: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/basic.py<<< 44842 1727204492.62953: stdout chunk (state=3): >>> <<< 44842 1727204492.62957: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.62980: stdout chunk (state=3): >>> <<< 44842 1727204492.62998: stdout chunk (state=3): >>># zipimport: zlib available <<< 44842 1727204492.63002: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/modules/__init__.py<<< 44842 1727204492.63007: stdout chunk (state=3): >>> <<< 44842 1727204492.63030: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.63035: stdout chunk (state=3): >>> <<< 44842 1727204492.63211: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.63216: stdout chunk (state=3): >>> <<< 44842 1727204492.63475: stdout chunk (state=3): >>># zipimport: zlib available<<< 44842 1727204492.63480: stdout chunk (state=3): >>> <<< 44842 1727204492.63650: stdout chunk (state=3): >>> <<< 44842 1727204492.63655: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, 
"checksum_algorithm": "sha1"}}} <<< 44842 1727204492.63688: stdout chunk (state=3): >>># destroy __main__ <<< 44842 1727204492.64018: stdout chunk (state=3): >>># clear builtins._ <<< 44842 1727204492.64044: stdout chunk (state=3): >>># clear sys.path # clear sys.argv<<< 44842 1727204492.64082: stdout chunk (state=3): >>> # clear sys.ps1 <<< 44842 1727204492.64086: stdout chunk (state=3): >>># clear sys.ps2<<< 44842 1727204492.64099: stdout chunk (state=3): >>> <<< 44842 1727204492.64108: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value <<< 44842 1727204492.64115: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.path_hooks <<< 44842 1727204492.64121: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 44842 1727204492.64123: stdout chunk (state=3): >>> <<< 44842 1727204492.64128: stdout chunk (state=3): >>># clear sys.meta_path <<< 44842 1727204492.64149: stdout chunk (state=3): >>># clear sys.__interactivehook__ # restore sys.stdin <<< 44842 1727204492.64187: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys <<< 44842 1727204492.64219: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 44842 1727204492.64269: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal<<< 44842 1727204492.64308: stdout chunk (state=3): >>> # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport<<< 44842 1727204492.64328: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 44842 1727204492.64349: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8<<< 44842 1727204492.64385: stdout chunk (state=3): 
>>> # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1<<< 44842 1727204492.64549: stdout chunk (state=3): >>> # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale <<< 44842 1727204492.64591: stdout chunk (state=3): >>># destroy _bootlocale # cleanup[2] removing site <<< 44842 1727204492.64613: stdout chunk (state=3): >>># destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre<<< 44842 1727204492.64638: stdout chunk (state=3): >>> # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse<<< 44842 1727204492.64672: stdout chunk (state=3): >>> # cleanup[2] removing sre_compile <<< 44842 1727204492.64697: stdout chunk (state=3): >>># cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools <<< 44842 1727204492.64727: stdout chunk (state=3): >>># cleanup[2] removing keyword # destroy keyword <<< 44842 1727204492.64748: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator <<< 44842 1727204492.64780: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections<<< 44842 1727204492.64802: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools<<< 44842 1727204492.64829: stdout chunk (state=3): >>> # cleanup[2] removing functools # cleanup[2] removing copyreg<<< 44842 1727204492.64851: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct <<< 44842 1727204492.64890: stdout chunk (state=3): >>># cleanup[2] removing struct # cleanup[2] 
removing binascii # cleanup[2] removing base64<<< 44842 1727204492.64914: stdout chunk (state=3): >>> # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external<<< 44842 1727204492.64940: stdout chunk (state=3): >>> # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery<<< 44842 1727204492.64962: stdout chunk (state=3): >>> # cleanup[2] removing collections.abc<<< 44842 1727204492.64989: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing<<< 44842 1727204492.65015: stdout chunk (state=3): >>> # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util<<< 44842 1727204492.65037: stdout chunk (state=3): >>> # cleanup[2] removing _weakrefset # destroy _weakrefset <<< 44842 1727204492.65056: stdout chunk (state=3): >>># cleanup[2] removing weakref # cleanup[2] removing pkgutil<<< 44842 1727204492.65075: stdout chunk (state=3): >>> # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib<<< 44842 1727204492.65095: stdout chunk (state=3): >>> # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math<<< 44842 1727204492.65117: stdout chunk (state=3): >>> # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing 
ansible.module_utils # destroy ansible.module_utils <<< 44842 1727204492.65132: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal <<< 44842 1727204492.65166: stdout chunk (state=3): >>># cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 44842 1727204492.65181: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging <<< 44842 1727204492.65202: stdout chunk (state=3): >>># cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat<<< 44842 1727204492.65223: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters<<< 44842 1727204492.65239: stdout chunk (state=3): >>> # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing<<< 44842 1727204492.65263: stdout chunk (state=3): >>> # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters<<< 44842 1727204492.65285: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process<<<
44842 1727204492.65301: stdout chunk (state=3): >>> # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules<<<
44842 1727204492.65551: stdout chunk (state=3): >>> # destroy _sitebuiltins<<<
44842 1727204492.65577: stdout chunk (state=3): >>> # destroy importlib.util # destroy importlib.abc<<<
44842 1727204492.65596: stdout chunk (state=3): >>> # destroy importlib.machinery <<<
44842 1727204492.65633: stdout chunk (state=3): >>># destroy zipimport<<<
44842 1727204492.65659: stdout chunk (state=3): >>> # destroy _compression <<<
44842 1727204492.65688: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct<<<
44842 1727204492.65713: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma <<<
44842 1727204492.65748: stdout chunk (state=3): >>># destroy __main__<<<
44842 1727204492.65782: stdout chunk (state=3): >>> # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon <<<
44842 1727204492.65804: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy hashlib <<<
44842 1727204492.65829: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder <<<
44842 1727204492.65854: stdout chunk (state=3): >>># destroy json.scanner <<<
44842 1727204492.65877: stdout chunk (state=3): >>># destroy _json # destroy encodings<<<
44842 1727204492.65897: stdout chunk (state=3): >>> <<<
44842 1727204492.65917: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<<
44842 1727204492.65941: stdout chunk (state=3): >>># destroy array <<<
44842 1727204492.65965: stdout chunk (state=3): >>># destroy datetime <<<
44842 1727204492.65993: stdout chunk (state=3): >>># destroy selinux <<<
44842 1727204492.66016: stdout chunk (state=3): >>># destroy distro # destroy json # destroy shlex # destroy logging<<<
44842 1727204492.66032: stdout chunk (state=3): >>> # destroy argparse <<<
44842 1727204492.66362: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<<
44842 1727204492.66421: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random<<<
44842 1727204492.66440: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math<<<
44842 1727204492.66462: stdout chunk (state=3): >>> # cleanup[3] wiping shutil # destroy fnmatch<<<
44842 1727204492.66479: stdout chunk (state=3): >>> <<<
44842 1727204492.66505: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading<<<
44842 1727204492.66516: stdout chunk (state=3): >>> # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc<<<
44842 1727204492.66531: stdout chunk (state=3): >>> # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<<
44842 1727204492.66550: stdout chunk (state=3): >>> # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools<<<
44842 1727204492.66566: stdout chunk (state=3): >>> # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc<<<
44842 1727204492.66586: stdout chunk (state=3): >>> # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq<<<
44842 1727204492.66598: stdout chunk (state=3): >>> # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os<<<
44842 1727204492.66628: stdout chunk (state=3): >>> # cleanup[3] wiping os.path<<<
44842 1727204492.66635: stdout chunk (state=3): >>> # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<<
44842 1727204492.66650: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<<
44842 1727204492.66668: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<<
44842 1727204492.66680: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp<<<
44842 1727204492.66698: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon<<<
44842 1727204492.66709: stdout chunk (state=3): >>> # destroy _socket # destroy systemd.id128<<<
44842 1727204492.66893: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform<<<
44842 1727204492.66913: stdout chunk (state=3): >>> # destroy _uuid # destroy _sre<<<
44842 1727204492.66924: stdout chunk (state=3): >>> # destroy sre_parse <<<
44842 1727204492.66951: stdout chunk (state=3): >>># destroy tokenize <<<
44842 1727204492.66966: stdout chunk (state=3): >>># destroy _heapq <<<
44842 1727204492.66991: stdout chunk (state=3): >>># destroy posixpath <<<
44842 1727204492.67000: stdout chunk (state=3): >>># destroy stat <<<
44842 1727204492.67017: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<<
44842 1727204492.67041: stdout chunk (state=3): >>># destroy errno # destroy signal<<<
44842 1727204492.67069: stdout chunk (state=3): >>> # destroy contextlib # destroy pwd <<<
44842 1727204492.67088: stdout chunk (state=3): >>># destroy grp # destroy _posixsubprocess # destroy selectors<<<
44842 1727204492.67104: stdout chunk (state=3): >>> <<<
44842 1727204492.67135: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse<<<
44842 1727204492.67152: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request <<<
44842 1727204492.67198: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools <<<
44842 1727204492.67214: stdout chunk (state=3): >>># destroy itertools # destroy operator # destroy ansible.module_utils.six.moves<<<
44842 1727204492.67230: stdout chunk (state=3): >>> # destroy _operator <<<
44842 1727204492.67247: stdout chunk (state=3): >>># destroy _frozen_importlib_external<<<
44842 1727204492.67280: stdout chunk (state=3): >>> # destroy _imp<<<
44842 1727204492.67287: stdout chunk (state=3): >>> # destroy io # destroy marshal <<<
44842 1727204492.67336: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<<
44842 1727204492.67744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<<
44842 1727204492.67801: stderr chunk (state=3): >>><<<
44842 1727204492.67805: stdout chunk (state=3): >>><<<
44842 1727204492.67876: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81df3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d983a0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1e81df3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81df3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' 
import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d4f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d4f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81db0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d48d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d72d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81d98970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81ceef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cf40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81ce75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cef6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cee3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81c71e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c71970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c71f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c71dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cc9df0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cc26d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cd5730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cf5e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81c81d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cc9310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81cd5340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cfba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c55430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c55520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c8afa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c84af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c844c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81964280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c40dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c84f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81cfb0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81974bb0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81974ee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e819877f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81987d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81914460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81974fd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81924340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81987670> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81924400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940a30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81940820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81940d60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8194b2b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e819409a0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81934af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81c81670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81940b50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1e8185b730> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81782160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782fd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817824f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782df0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81782580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81782100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1e811adfa0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e811cbc70> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e811cbf70> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e811cb310> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e9dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e93a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e9f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817b9e80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81755d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81755460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8178c550> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81755580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817555b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119ef70> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817cb2b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119b7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817cb430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817e2e80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8119b790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119b5e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119a550> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8119a490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817c1970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8174b6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8174ab80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8175b0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e8174b100> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1e8178ebe0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e81167ac0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e81749d00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8173f850> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8174a9d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e80d70310> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1e817d32b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e817497c0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e80d52760> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8115d610> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1e8115cb80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_ydgsr5gy/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # 
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 44842 1727204492.69085: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204492.69088: _low_level_execute_command(): starting 44842 1727204492.69091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204492.0994155-45034-212500635337687/ > /dev/null 2>&1 && sleep 0' 44842 1727204492.69093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204492.69095: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.69097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.69099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.69101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204492.69103: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204492.69106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.69108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204492.69110: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204492.69112: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204492.69119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204492.69121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204492.69123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204492.69125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204492.69127: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204492.69129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204492.69131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204492.69133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204492.69135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204492.69137: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204492.71609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204492.71656: stderr chunk (state=3): >>><<< 44842 1727204492.71664: stdout chunk (state=3): >>><<< 44842 1727204492.71682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204492.71685: handler run complete 44842 1727204492.71711: attempt loop complete, returning result 44842 1727204492.71714: _execute() done 44842 1727204492.71716: dumping result to json 44842 1727204492.71718: done dumping result, returning 44842 1727204492.71723: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [0affcd87-79f5-aad0-d242-0000000000c2] 44842 1727204492.71729: sending task result for task 0affcd87-79f5-aad0-d242-0000000000c2 44842 1727204492.71835: done 
sending task result for task 0affcd87-79f5-aad0-d242-0000000000c2 44842 1727204492.71838: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 44842 1727204492.71904: no more pending results, returning what we have 44842 1727204492.71907: results queue empty 44842 1727204492.71908: checking for any_errors_fatal 44842 1727204492.71915: done checking for any_errors_fatal 44842 1727204492.71915: checking for max_fail_percentage 44842 1727204492.71917: done checking for max_fail_percentage 44842 1727204492.71917: checking to see if all hosts have failed and the running result is not ok 44842 1727204492.71918: done checking to see if all hosts have failed 44842 1727204492.71919: getting the remaining hosts for this loop 44842 1727204492.71921: done getting the remaining hosts for this loop 44842 1727204492.71924: getting the next task for host managed-node1 44842 1727204492.71930: done getting next task for host managed-node1 44842 1727204492.71933: ^ task is: TASK: Set flag to indicate system is ostree 44842 1727204492.71935: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.71938: getting variables 44842 1727204492.71940: in VariableManager get_vars() 44842 1727204492.71974: Calling all_inventory to load vars for managed-node1 44842 1727204492.71977: Calling groups_inventory to load vars for managed-node1 44842 1727204492.71980: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.71990: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.71993: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.71995: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.72182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.72417: done with get_vars() 44842 1727204492.72436: done getting variables 44842 1727204492.72547: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.698) 0:00:02.893 ***** 44842 1727204492.72584: entering _queue_task() for managed-node1/set_fact 44842 1727204492.72586: Creating lock for set_fact 44842 1727204492.72885: worker is 1 (out of 1 available) 44842 1727204492.72896: exiting _queue_task() for managed-node1/set_fact 44842 1727204492.72907: done queuing things up, now waiting for results queue to drain 44842 1727204492.72909: waiting for pending results... 
44842 1727204492.73178: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 44842 1727204492.73295: in run() - task 0affcd87-79f5-aad0-d242-0000000000c3 44842 1727204492.73321: variable 'ansible_search_path' from source: unknown 44842 1727204492.73331: variable 'ansible_search_path' from source: unknown 44842 1727204492.73383: calling self._execute() 44842 1727204492.73473: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.73483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.73495: variable 'omit' from source: magic vars 44842 1727204492.74311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204492.74648: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204492.74705: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204492.74757: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204492.74801: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204492.74908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204492.74949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204492.74985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204492.75017: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204492.75170: Evaluated conditional (not __network_is_ostree is defined): True 44842 1727204492.75527: variable 'omit' from source: magic vars 44842 1727204492.75614: variable 'omit' from source: magic vars 44842 1727204492.75762: variable '__ostree_booted_stat' from source: set_fact 44842 1727204492.75827: variable 'omit' from source: magic vars 44842 1727204492.75856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204492.75888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204492.75909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204492.75943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204492.75958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204492.75991: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204492.75998: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.76006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.76187: Set connection var ansible_shell_type to sh 44842 1727204492.76202: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204492.76247: Set connection var ansible_connection to ssh 44842 1727204492.76262: Set connection var ansible_pipelining to False 44842 1727204492.76279: Set connection var ansible_timeout to 10 44842 1727204492.76294: Set connection var ansible_shell_executable to /bin/sh 44842 1727204492.76317: variable 'ansible_shell_executable' 
from source: unknown 44842 1727204492.76320: variable 'ansible_connection' from source: unknown 44842 1727204492.76322: variable 'ansible_module_compression' from source: unknown 44842 1727204492.76325: variable 'ansible_shell_type' from source: unknown 44842 1727204492.76327: variable 'ansible_shell_executable' from source: unknown 44842 1727204492.76329: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.76331: variable 'ansible_pipelining' from source: unknown 44842 1727204492.76333: variable 'ansible_timeout' from source: unknown 44842 1727204492.76338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.76424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204492.76432: variable 'omit' from source: magic vars 44842 1727204492.76436: starting attempt loop 44842 1727204492.76440: running the handler 44842 1727204492.76449: handler run complete 44842 1727204492.76457: attempt loop complete, returning result 44842 1727204492.76459: _execute() done 44842 1727204492.76461: dumping result to json 44842 1727204492.76469: done dumping result, returning 44842 1727204492.76475: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [0affcd87-79f5-aad0-d242-0000000000c3] 44842 1727204492.76480: sending task result for task 0affcd87-79f5-aad0-d242-0000000000c3 44842 1727204492.76554: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000c3 44842 1727204492.76557: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 44842 1727204492.76649: no more pending results, returning what we have 44842 1727204492.76653: results 
queue empty 44842 1727204492.76653: checking for any_errors_fatal 44842 1727204492.76659: done checking for any_errors_fatal 44842 1727204492.76659: checking for max_fail_percentage 44842 1727204492.76665: done checking for max_fail_percentage 44842 1727204492.76666: checking to see if all hosts have failed and the running result is not ok 44842 1727204492.76667: done checking to see if all hosts have failed 44842 1727204492.76668: getting the remaining hosts for this loop 44842 1727204492.76669: done getting the remaining hosts for this loop 44842 1727204492.76673: getting the next task for host managed-node1 44842 1727204492.76682: done getting next task for host managed-node1 44842 1727204492.76684: ^ task is: TASK: Fix CentOS6 Base repo 44842 1727204492.76687: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.76691: getting variables 44842 1727204492.76692: in VariableManager get_vars() 44842 1727204492.76718: Calling all_inventory to load vars for managed-node1 44842 1727204492.76721: Calling groups_inventory to load vars for managed-node1 44842 1727204492.76724: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.76733: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.76735: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.76746: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.76900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.77024: done with get_vars() 44842 1727204492.77031: done getting variables 44842 1727204492.77121: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.045) 0:00:02.939 ***** 44842 1727204492.77141: entering _queue_task() for managed-node1/copy 44842 1727204492.77392: worker is 1 (out of 1 available) 44842 1727204492.77404: exiting _queue_task() for managed-node1/copy 44842 1727204492.77446: done queuing things up, now waiting for results queue to drain 44842 1727204492.77448: waiting for pending results... 
44842 1727204492.77553: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 44842 1727204492.77650: in run() - task 0affcd87-79f5-aad0-d242-0000000000c5 44842 1727204492.77654: variable 'ansible_search_path' from source: unknown 44842 1727204492.77657: variable 'ansible_search_path' from source: unknown 44842 1727204492.77707: calling self._execute() 44842 1727204492.78049: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.78052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.78055: variable 'omit' from source: magic vars 44842 1727204492.78313: variable 'ansible_distribution' from source: facts 44842 1727204492.78353: Evaluated conditional (ansible_distribution == 'CentOS'): True 44842 1727204492.78508: variable 'ansible_distribution_major_version' from source: facts 44842 1727204492.78527: Evaluated conditional (ansible_distribution_major_version == '6'): False 44842 1727204492.78535: when evaluation is False, skipping this task 44842 1727204492.78543: _execute() done 44842 1727204492.78550: dumping result to json 44842 1727204492.78562: done dumping result, returning 44842 1727204492.78574: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [0affcd87-79f5-aad0-d242-0000000000c5] 44842 1727204492.78586: sending task result for task 0affcd87-79f5-aad0-d242-0000000000c5 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 44842 1727204492.78759: no more pending results, returning what we have 44842 1727204492.78768: results queue empty 44842 1727204492.78769: checking for any_errors_fatal 44842 1727204492.78773: done checking for any_errors_fatal 44842 1727204492.78773: checking for max_fail_percentage 44842 1727204492.78775: done checking for max_fail_percentage 44842 1727204492.78776: checking to see if all hosts have failed and the 
running result is not ok 44842 1727204492.78777: done checking to see if all hosts have failed 44842 1727204492.78777: getting the remaining hosts for this loop 44842 1727204492.78779: done getting the remaining hosts for this loop 44842 1727204492.78783: getting the next task for host managed-node1 44842 1727204492.78790: done getting next task for host managed-node1 44842 1727204492.78792: ^ task is: TASK: Include the task 'enable_epel.yml' 44842 1727204492.78795: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.78805: getting variables 44842 1727204492.78807: in VariableManager get_vars() 44842 1727204492.78835: Calling all_inventory to load vars for managed-node1 44842 1727204492.78839: Calling groups_inventory to load vars for managed-node1 44842 1727204492.78843: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.78854: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.78857: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.78863: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.79247: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000c5 44842 1727204492.79251: WORKER PROCESS EXITING 44842 1727204492.79298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.79636: done with get_vars() 44842 1727204492.79647: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.026) 0:00:02.965 ***** 44842 1727204492.79755: entering _queue_task() for managed-node1/include_tasks 44842 1727204492.80000: worker is 1 (out of 1 available) 44842 1727204492.80011: exiting _queue_task() for managed-node1/include_tasks 44842 1727204492.80027: done queuing things up, now waiting for results queue to drain 44842 1727204492.80028: waiting for pending results... 
44842 1727204492.80177: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 44842 1727204492.80254: in run() - task 0affcd87-79f5-aad0-d242-0000000000c6 44842 1727204492.80268: variable 'ansible_search_path' from source: unknown 44842 1727204492.80271: variable 'ansible_search_path' from source: unknown 44842 1727204492.80300: calling self._execute() 44842 1727204492.80352: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.80355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.80365: variable 'omit' from source: magic vars 44842 1727204492.80758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204492.82838: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204492.82921: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204492.82965: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204492.83004: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204492.83036: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204492.83116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204492.83151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204492.83187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204492.83230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204492.83251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204492.83367: variable '__network_is_ostree' from source: set_fact 44842 1727204492.83389: Evaluated conditional (not __network_is_ostree | d(false)): True 44842 1727204492.83398: _execute() done 44842 1727204492.83405: dumping result to json 44842 1727204492.83411: done dumping result, returning 44842 1727204492.83419: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-aad0-d242-0000000000c6] 44842 1727204492.83429: sending task result for task 0affcd87-79f5-aad0-d242-0000000000c6 44842 1727204492.83557: no more pending results, returning what we have 44842 1727204492.83562: in VariableManager get_vars() 44842 1727204492.83595: Calling all_inventory to load vars for managed-node1 44842 1727204492.83598: Calling groups_inventory to load vars for managed-node1 44842 1727204492.83601: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.83612: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.83615: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.83618: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.83839: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000c6 44842 1727204492.83842: WORKER PROCESS EXITING 44842 1727204492.83868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 44842 1727204492.84097: done with get_vars() 44842 1727204492.84113: variable 'ansible_search_path' from source: unknown 44842 1727204492.84115: variable 'ansible_search_path' from source: unknown 44842 1727204492.84150: we have included files to process 44842 1727204492.84152: generating all_blocks data 44842 1727204492.84153: done generating all_blocks data 44842 1727204492.84159: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44842 1727204492.84160: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44842 1727204492.84162: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44842 1727204492.84904: done processing included file 44842 1727204492.84907: iterating over new_blocks loaded from include file 44842 1727204492.84908: in VariableManager get_vars() 44842 1727204492.84920: done with get_vars() 44842 1727204492.84921: filtering new block on tags 44842 1727204492.84943: done filtering new block on tags 44842 1727204492.84946: in VariableManager get_vars() 44842 1727204492.84956: done with get_vars() 44842 1727204492.84958: filtering new block on tags 44842 1727204492.84971: done filtering new block on tags 44842 1727204492.84973: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 44842 1727204492.84984: extending task lists for all hosts with included blocks 44842 1727204492.85104: done extending task lists 44842 1727204492.85105: done processing included files 44842 1727204492.85106: results queue empty 44842 1727204492.85107: checking for any_errors_fatal 44842 1727204492.85110: done checking for any_errors_fatal 44842 1727204492.85111: checking for max_fail_percentage 44842 1727204492.85112: done 
checking for max_fail_percentage 44842 1727204492.85112: checking to see if all hosts have failed and the running result is not ok 44842 1727204492.85113: done checking to see if all hosts have failed 44842 1727204492.85114: getting the remaining hosts for this loop 44842 1727204492.85115: done getting the remaining hosts for this loop 44842 1727204492.85118: getting the next task for host managed-node1 44842 1727204492.85122: done getting next task for host managed-node1 44842 1727204492.85124: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 44842 1727204492.85127: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.85129: getting variables 44842 1727204492.85130: in VariableManager get_vars() 44842 1727204492.85139: Calling all_inventory to load vars for managed-node1 44842 1727204492.85141: Calling groups_inventory to load vars for managed-node1 44842 1727204492.85143: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.85148: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.85156: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.85159: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.85359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.85588: done with get_vars() 44842 1727204492.85597: done getting variables 44842 1727204492.85671: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 44842 1727204492.85887: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.061) 0:00:03.027 ***** 44842 1727204492.85930: entering _queue_task() for managed-node1/command 44842 1727204492.85932: Creating lock for command 44842 1727204492.86239: worker is 1 (out of 1 available) 44842 1727204492.86250: exiting _queue_task() for managed-node1/command 44842 1727204492.86261: done queuing things up, now waiting for results queue to drain 44842 1727204492.86262: waiting for pending results... 
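The "Create EPEL 9" task queued above is gated on distribution facts, as the conditional evaluations that follow show. As a minimal hedged sketch of the task at enable_epel.yml:8 — only the task name, the `command` action plugin, and the two `when` conditions are confirmed by this log; the command body itself is an assumption:

```yaml
# Hypothetical reconstruction from the log; the real enable_epel.yml may differ.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-                     # the 'command' action module is loaded for this task
    dnf install -y
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']       # evaluated True in the log
    - ansible_distribution_major_version in ['7', '8']   # evaluated False on this node
```

Because the managed node's major version is 9, the second condition fails, which is why the execution that follows returns `skipping:` with `false_condition` set rather than running the command.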
44842 1727204492.86541: running TaskExecutor() for managed-node1/TASK: Create EPEL 9 44842 1727204492.86653: in run() - task 0affcd87-79f5-aad0-d242-0000000000e0 44842 1727204492.86672: variable 'ansible_search_path' from source: unknown 44842 1727204492.86679: variable 'ansible_search_path' from source: unknown 44842 1727204492.86724: calling self._execute() 44842 1727204492.86804: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.86820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.86846: variable 'omit' from source: magic vars 44842 1727204492.87253: variable 'ansible_distribution' from source: facts 44842 1727204492.87280: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44842 1727204492.87425: variable 'ansible_distribution_major_version' from source: facts 44842 1727204492.87436: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44842 1727204492.87444: when evaluation is False, skipping this task 44842 1727204492.87451: _execute() done 44842 1727204492.87457: dumping result to json 44842 1727204492.87469: done dumping result, returning 44842 1727204492.87487: done running TaskExecutor() for managed-node1/TASK: Create EPEL 9 [0affcd87-79f5-aad0-d242-0000000000e0] 44842 1727204492.87504: sending task result for task 0affcd87-79f5-aad0-d242-0000000000e0 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44842 1727204492.87676: no more pending results, returning what we have 44842 1727204492.87680: results queue empty 44842 1727204492.87681: checking for any_errors_fatal 44842 1727204492.87682: done checking for any_errors_fatal 44842 1727204492.87683: checking for max_fail_percentage 44842 1727204492.87684: done checking for max_fail_percentage 44842 1727204492.87685: checking to see if all hosts have failed and 
the running result is not ok 44842 1727204492.87686: done checking to see if all hosts have failed 44842 1727204492.87687: getting the remaining hosts for this loop 44842 1727204492.87688: done getting the remaining hosts for this loop 44842 1727204492.87693: getting the next task for host managed-node1 44842 1727204492.87701: done getting next task for host managed-node1 44842 1727204492.87703: ^ task is: TASK: Install yum-utils package 44842 1727204492.87707: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.87712: getting variables 44842 1727204492.87714: in VariableManager get_vars() 44842 1727204492.87745: Calling all_inventory to load vars for managed-node1 44842 1727204492.87748: Calling groups_inventory to load vars for managed-node1 44842 1727204492.87752: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.87766: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.87769: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.87773: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.88002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.88236: done with get_vars() 44842 1727204492.88246: done getting variables 44842 1727204492.88417: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000e0 44842 1727204492.88420: WORKER PROCESS EXITING 44842 1727204492.88503: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.027) 0:00:03.054 ***** 44842 1727204492.88650: entering _queue_task() for managed-node1/package 44842 1727204492.88652: Creating lock for package 44842 1727204492.88976: worker is 1 (out of 1 available) 44842 1727204492.88986: exiting _queue_task() for managed-node1/package 44842 1727204492.88997: done queuing things up, now waiting for results queue to drain 44842 1727204492.88998: waiting for pending results... 
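The same gating applies to the "Install yum-utils package" task queued above; the log confirms the `package` action module and the same two conditionals. A hedged sketch of the task at enable_epel.yml:26, where the package name and state are inferred from the task name rather than visible in the log:

```yaml
# Hypothetical sketch; only the task name, action plugin, and conditions are confirmed by the log.
- name: Install yum-utils package
  package:
    name: yum-utils      # inferred from the task name, not shown in the log
    state: present       # assumption
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```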
44842 1727204492.89249: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 44842 1727204492.89372: in run() - task 0affcd87-79f5-aad0-d242-0000000000e1 44842 1727204492.89399: variable 'ansible_search_path' from source: unknown 44842 1727204492.89408: variable 'ansible_search_path' from source: unknown 44842 1727204492.89454: calling self._execute() 44842 1727204492.89542: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.89557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.89573: variable 'omit' from source: magic vars 44842 1727204492.89975: variable 'ansible_distribution' from source: facts 44842 1727204492.89997: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44842 1727204492.90139: variable 'ansible_distribution_major_version' from source: facts 44842 1727204492.90159: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44842 1727204492.90170: when evaluation is False, skipping this task 44842 1727204492.90178: _execute() done 44842 1727204492.90186: dumping result to json 44842 1727204492.90193: done dumping result, returning 44842 1727204492.90207: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [0affcd87-79f5-aad0-d242-0000000000e1] 44842 1727204492.90219: sending task result for task 0affcd87-79f5-aad0-d242-0000000000e1 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44842 1727204492.90375: no more pending results, returning what we have 44842 1727204492.90379: results queue empty 44842 1727204492.90380: checking for any_errors_fatal 44842 1727204492.90390: done checking for any_errors_fatal 44842 1727204492.90391: checking for max_fail_percentage 44842 1727204492.90393: done checking for max_fail_percentage 44842 1727204492.90394: checking to see if 
all hosts have failed and the running result is not ok 44842 1727204492.90395: done checking to see if all hosts have failed 44842 1727204492.90396: getting the remaining hosts for this loop 44842 1727204492.90397: done getting the remaining hosts for this loop 44842 1727204492.90402: getting the next task for host managed-node1 44842 1727204492.90409: done getting next task for host managed-node1 44842 1727204492.90412: ^ task is: TASK: Enable EPEL 7 44842 1727204492.90416: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.90419: getting variables 44842 1727204492.90422: in VariableManager get_vars() 44842 1727204492.90453: Calling all_inventory to load vars for managed-node1 44842 1727204492.90457: Calling groups_inventory to load vars for managed-node1 44842 1727204492.90460: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.90481: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.90485: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.90489: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.90706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.90973: done with get_vars() 44842 1727204492.90990: done getting variables 44842 1727204492.91174: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000e1 44842 1727204492.91177: WORKER PROCESS EXITING 44842 1727204492.91217: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.027) 0:00:03.081 ***** 44842 1727204492.91372: entering _queue_task() for managed-node1/command 44842 1727204492.91643: worker is 1 (out of 1 available) 44842 1727204492.91654: exiting _queue_task() for managed-node1/command 44842 1727204492.91667: done queuing things up, now waiting for results queue to drain 44842 1727204492.91668: waiting for pending results... 
44842 1727204492.91927: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 44842 1727204492.92050: in run() - task 0affcd87-79f5-aad0-d242-0000000000e2 44842 1727204492.92070: variable 'ansible_search_path' from source: unknown 44842 1727204492.92079: variable 'ansible_search_path' from source: unknown 44842 1727204492.92130: calling self._execute() 44842 1727204492.92208: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.92239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.92372: variable 'omit' from source: magic vars 44842 1727204492.92775: variable 'ansible_distribution' from source: facts 44842 1727204492.92804: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44842 1727204492.93015: variable 'ansible_distribution_major_version' from source: facts 44842 1727204492.93048: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44842 1727204492.93056: when evaluation is False, skipping this task 44842 1727204492.93069: _execute() done 44842 1727204492.93078: dumping result to json 44842 1727204492.93085: done dumping result, returning 44842 1727204492.93093: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [0affcd87-79f5-aad0-d242-0000000000e2] 44842 1727204492.93104: sending task result for task 0affcd87-79f5-aad0-d242-0000000000e2 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44842 1727204492.93266: no more pending results, returning what we have 44842 1727204492.93270: results queue empty 44842 1727204492.93271: checking for any_errors_fatal 44842 1727204492.93278: done checking for any_errors_fatal 44842 1727204492.93279: checking for max_fail_percentage 44842 1727204492.93281: done checking for max_fail_percentage 44842 1727204492.93282: checking to see if all hosts have failed and 
the running result is not ok 44842 1727204492.93283: done checking to see if all hosts have failed 44842 1727204492.93284: getting the remaining hosts for this loop 44842 1727204492.93286: done getting the remaining hosts for this loop 44842 1727204492.93290: getting the next task for host managed-node1 44842 1727204492.93298: done getting next task for host managed-node1 44842 1727204492.93300: ^ task is: TASK: Enable EPEL 8 44842 1727204492.93304: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.93308: getting variables 44842 1727204492.93309: in VariableManager get_vars() 44842 1727204492.93337: Calling all_inventory to load vars for managed-node1 44842 1727204492.93340: Calling groups_inventory to load vars for managed-node1 44842 1727204492.93343: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.93356: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.93359: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.93362: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.93592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.93788: done with get_vars() 44842 1727204492.93806: done getting variables 44842 1727204492.93913: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000e2 44842 1727204492.93916: WORKER PROCESS EXITING 44842 1727204492.93931: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.025) 0:00:03.107 ***** 44842 1727204492.93962: entering _queue_task() for managed-node1/command 44842 1727204492.94388: worker is 1 (out of 1 available) 44842 1727204492.94398: exiting _queue_task() for managed-node1/command 44842 1727204492.94409: done queuing things up, now waiting for results queue to drain 44842 1727204492.94410: waiting for pending results... 
44842 1727204492.95205: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 44842 1727204492.95296: in run() - task 0affcd87-79f5-aad0-d242-0000000000e3 44842 1727204492.95305: variable 'ansible_search_path' from source: unknown 44842 1727204492.95308: variable 'ansible_search_path' from source: unknown 44842 1727204492.95344: calling self._execute() 44842 1727204492.95418: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.95424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.95433: variable 'omit' from source: magic vars 44842 1727204492.95908: variable 'ansible_distribution' from source: facts 44842 1727204492.95925: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44842 1727204492.96068: variable 'ansible_distribution_major_version' from source: facts 44842 1727204492.96080: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 44842 1727204492.96087: when evaluation is False, skipping this task 44842 1727204492.96101: _execute() done 44842 1727204492.96114: dumping result to json 44842 1727204492.96122: done dumping result, returning 44842 1727204492.96131: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [0affcd87-79f5-aad0-d242-0000000000e3] 44842 1727204492.96141: sending task result for task 0affcd87-79f5-aad0-d242-0000000000e3 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 44842 1727204492.96296: no more pending results, returning what we have 44842 1727204492.96300: results queue empty 44842 1727204492.96301: checking for any_errors_fatal 44842 1727204492.96308: done checking for any_errors_fatal 44842 1727204492.96309: checking for max_fail_percentage 44842 1727204492.96311: done checking for max_fail_percentage 44842 1727204492.96312: checking to see if all hosts have failed and 
the running result is not ok 44842 1727204492.96313: done checking to see if all hosts have failed 44842 1727204492.96313: getting the remaining hosts for this loop 44842 1727204492.96315: done getting the remaining hosts for this loop 44842 1727204492.96319: getting the next task for host managed-node1 44842 1727204492.96329: done getting next task for host managed-node1 44842 1727204492.96332: ^ task is: TASK: Enable EPEL 6 44842 1727204492.96336: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204492.96340: getting variables 44842 1727204492.96343: in VariableManager get_vars() 44842 1727204492.96376: Calling all_inventory to load vars for managed-node1 44842 1727204492.96380: Calling groups_inventory to load vars for managed-node1 44842 1727204492.96383: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204492.96396: Calling all_plugins_play to load vars for managed-node1 44842 1727204492.96399: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204492.96402: Calling groups_plugins_play to load vars for managed-node1 44842 1727204492.96609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204492.96845: done with get_vars() 44842 1727204492.96856: done getting variables 44842 1727204492.97016: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000e3 44842 1727204492.97019: WORKER PROCESS EXITING 44842 1727204492.97058: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.031) 0:00:03.139 ***** 44842 1727204492.97096: entering _queue_task() for managed-node1/copy 44842 1727204492.98472: worker is 1 (out of 1 available) 44842 1727204492.98482: exiting _queue_task() for managed-node1/copy 44842 1727204492.98495: done queuing things up, now waiting for results queue to drain 44842 1727204492.98496: waiting for pending results... 
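Unlike the EPEL 7/8 tasks, the "Enable EPEL 6" task queued above loads the `copy` action module and is gated on a different conditional (`ansible_distribution_major_version == '6'`, per the evaluation that follows). A hedged sketch of the task at enable_epel.yml:42 — the `src` and `dest` values are illustrative placeholders, not taken from the log:

```yaml
# Hypothetical sketch; src/dest are placeholders, only the name, action, and conditions are logged.
- name: Enable EPEL 6
  copy:
    src: epel.repo               # placeholder
    dest: /etc/yum.repos.d/      # placeholder
  when:
    - ansible_distribution in ['RedHat', 'CentOS']       # evaluated True in the log
    - ansible_distribution_major_version == '6'          # evaluated False on this node
```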
44842 1727204492.99003: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 44842 1727204492.99098: in run() - task 0affcd87-79f5-aad0-d242-0000000000e5 44842 1727204492.99110: variable 'ansible_search_path' from source: unknown 44842 1727204492.99114: variable 'ansible_search_path' from source: unknown 44842 1727204492.99149: calling self._execute() 44842 1727204492.99225: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204492.99229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204492.99239: variable 'omit' from source: magic vars 44842 1727204492.99901: variable 'ansible_distribution' from source: facts 44842 1727204492.99912: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 44842 1727204493.00025: variable 'ansible_distribution_major_version' from source: facts 44842 1727204493.00030: Evaluated conditional (ansible_distribution_major_version == '6'): False 44842 1727204493.00033: when evaluation is False, skipping this task 44842 1727204493.00036: _execute() done 44842 1727204493.00040: dumping result to json 44842 1727204493.00042: done dumping result, returning 44842 1727204493.00049: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [0affcd87-79f5-aad0-d242-0000000000e5] 44842 1727204493.00056: sending task result for task 0affcd87-79f5-aad0-d242-0000000000e5 44842 1727204493.00152: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000e5 44842 1727204493.00155: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 44842 1727204493.00218: no more pending results, returning what we have 44842 1727204493.00223: results queue empty 44842 1727204493.00224: checking for any_errors_fatal 44842 1727204493.00230: done checking for any_errors_fatal 44842 1727204493.00230: checking for max_fail_percentage 
44842 1727204493.00232: done checking for max_fail_percentage 44842 1727204493.00233: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.00234: done checking to see if all hosts have failed 44842 1727204493.00234: getting the remaining hosts for this loop 44842 1727204493.00236: done getting the remaining hosts for this loop 44842 1727204493.00240: getting the next task for host managed-node1 44842 1727204493.00252: done getting next task for host managed-node1 44842 1727204493.00254: ^ task is: TASK: Set network provider to 'nm' 44842 1727204493.00256: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204493.00260: getting variables 44842 1727204493.00261: in VariableManager get_vars() 44842 1727204493.00297: Calling all_inventory to load vars for managed-node1 44842 1727204493.00300: Calling groups_inventory to load vars for managed-node1 44842 1727204493.00303: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.00314: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.00317: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.00319: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.00540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.01410: done with get_vars() 44842 1727204493.01420: done getting variables 44842 1727204493.01481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:13 Tuesday 24 September 2024 15:01:33 -0400 (0:00:00.044) 0:00:03.185 ***** 44842 1727204493.01741: entering _queue_task() for managed-node1/set_fact 44842 1727204493.02458: worker is 1 (out of 1 available) 44842 1727204493.02471: exiting _queue_task() for managed-node1/set_fact 44842 1727204493.02482: done queuing things up, now waiting for results queue to drain 44842 1727204493.02483: waiting for pending results... 44842 1727204493.03640: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 44842 1727204493.03856: in run() - task 0affcd87-79f5-aad0-d242-000000000007 44842 1727204493.03882: variable 'ansible_search_path' from source: unknown 44842 1727204493.03984: calling self._execute() 44842 1727204493.04110: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.04182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.04269: variable 'omit' from source: magic vars 44842 1727204493.04398: variable 'omit' from source: magic vars 44842 1727204493.04435: variable 'omit' from source: magic vars 44842 1727204493.04488: variable 'omit' from source: magic vars 44842 1727204493.04533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204493.04583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204493.04610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204493.04630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204493.04645: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204493.04692: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204493.04700: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.04707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.04812: Set connection var ansible_shell_type to sh 44842 1727204493.04827: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204493.04836: Set connection var ansible_connection to ssh 44842 1727204493.04845: Set connection var ansible_pipelining to False 44842 1727204493.04854: Set connection var ansible_timeout to 10 44842 1727204493.04866: Set connection var ansible_shell_executable to /bin/sh 44842 1727204493.04891: variable 'ansible_shell_executable' from source: unknown 44842 1727204493.04906: variable 'ansible_connection' from source: unknown 44842 1727204493.04915: variable 'ansible_module_compression' from source: unknown 44842 1727204493.04921: variable 'ansible_shell_type' from source: unknown 44842 1727204493.04926: variable 'ansible_shell_executable' from source: unknown 44842 1727204493.04932: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.04939: variable 'ansible_pipelining' from source: unknown 44842 1727204493.04944: variable 'ansible_timeout' from source: unknown 44842 1727204493.04951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.05099: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204493.05121: variable 'omit' from source: magic vars 44842 1727204493.05133: starting 
attempt loop 44842 1727204493.05139: running the handler 44842 1727204493.05154: handler run complete 44842 1727204493.05169: attempt loop complete, returning result 44842 1727204493.05176: _execute() done 44842 1727204493.05183: dumping result to json 44842 1727204493.05189: done dumping result, returning 44842 1727204493.05199: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [0affcd87-79f5-aad0-d242-000000000007] 44842 1727204493.05209: sending task result for task 0affcd87-79f5-aad0-d242-000000000007 44842 1727204493.05316: done sending task result for task 0affcd87-79f5-aad0-d242-000000000007 44842 1727204493.05323: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 44842 1727204493.05394: no more pending results, returning what we have 44842 1727204493.05397: results queue empty 44842 1727204493.05398: checking for any_errors_fatal 44842 1727204493.05405: done checking for any_errors_fatal 44842 1727204493.05405: checking for max_fail_percentage 44842 1727204493.05407: done checking for max_fail_percentage 44842 1727204493.05408: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.05409: done checking to see if all hosts have failed 44842 1727204493.05410: getting the remaining hosts for this loop 44842 1727204493.05412: done getting the remaining hosts for this loop 44842 1727204493.05416: getting the next task for host managed-node1 44842 1727204493.05424: done getting next task for host managed-node1 44842 1727204493.05426: ^ task is: TASK: meta (flush_handlers) 44842 1727204493.05428: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204493.05433: getting variables 44842 1727204493.05435: in VariableManager get_vars() 44842 1727204493.05473: Calling all_inventory to load vars for managed-node1 44842 1727204493.05477: Calling groups_inventory to load vars for managed-node1 44842 1727204493.05480: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.05491: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.05495: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.05497: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.05714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.05934: done with get_vars() 44842 1727204493.05952: done getting variables 44842 1727204493.06151: in VariableManager get_vars() 44842 1727204493.06166: Calling all_inventory to load vars for managed-node1 44842 1727204493.06169: Calling groups_inventory to load vars for managed-node1 44842 1727204493.06171: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.06176: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.06179: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.06181: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.06536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.06756: done with get_vars() 44842 1727204493.06772: done queuing things up, now waiting for results queue to drain 44842 1727204493.06777: results queue empty 44842 1727204493.06778: checking for any_errors_fatal 44842 1727204493.06781: done checking for any_errors_fatal 44842 1727204493.06781: checking for max_fail_percentage 44842 1727204493.06782: done checking for max_fail_percentage 44842 1727204493.06783: checking to see if all hosts have failed and the running result is not 
ok 44842 1727204493.06784: done checking to see if all hosts have failed 44842 1727204493.06785: getting the remaining hosts for this loop 44842 1727204493.06789: done getting the remaining hosts for this loop 44842 1727204493.06791: getting the next task for host managed-node1 44842 1727204493.06794: done getting next task for host managed-node1 44842 1727204493.06796: ^ task is: TASK: meta (flush_handlers) 44842 1727204493.06797: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204493.06804: getting variables 44842 1727204493.06805: in VariableManager get_vars() 44842 1727204493.06813: Calling all_inventory to load vars for managed-node1 44842 1727204493.06815: Calling groups_inventory to load vars for managed-node1 44842 1727204493.06816: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.06820: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.06822: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.06824: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.06965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.07187: done with get_vars() 44842 1727204493.07195: done getting variables 44842 1727204493.07373: in VariableManager get_vars() 44842 1727204493.07382: Calling all_inventory to load vars for managed-node1 44842 1727204493.07385: Calling groups_inventory to load vars for managed-node1 44842 1727204493.07387: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.07392: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.07394: Calling groups_plugins_inventory to load vars for 
managed-node1 44842 1727204493.07401: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.07597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.07801: done with get_vars() 44842 1727204493.07813: done queuing things up, now waiting for results queue to drain 44842 1727204493.07815: results queue empty 44842 1727204493.07816: checking for any_errors_fatal 44842 1727204493.07817: done checking for any_errors_fatal 44842 1727204493.07818: checking for max_fail_percentage 44842 1727204493.07819: done checking for max_fail_percentage 44842 1727204493.07819: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.07820: done checking to see if all hosts have failed 44842 1727204493.07821: getting the remaining hosts for this loop 44842 1727204493.07822: done getting the remaining hosts for this loop 44842 1727204493.07825: getting the next task for host managed-node1 44842 1727204493.07828: done getting next task for host managed-node1 44842 1727204493.07829: ^ task is: None 44842 1727204493.07830: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204493.07832: done queuing things up, now waiting for results queue to drain 44842 1727204493.07833: results queue empty 44842 1727204493.07833: checking for any_errors_fatal 44842 1727204493.07834: done checking for any_errors_fatal 44842 1727204493.07835: checking for max_fail_percentage 44842 1727204493.07836: done checking for max_fail_percentage 44842 1727204493.07836: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.07837: done checking to see if all hosts have failed 44842 1727204493.07839: getting the next task for host managed-node1 44842 1727204493.07841: done getting next task for host managed-node1 44842 1727204493.07842: ^ task is: None 44842 1727204493.07843: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204493.07927: in VariableManager get_vars() 44842 1727204493.07948: done with get_vars() 44842 1727204493.07954: in VariableManager get_vars() 44842 1727204493.07972: done with get_vars() 44842 1727204493.07977: variable 'omit' from source: magic vars 44842 1727204493.08025: in VariableManager get_vars() 44842 1727204493.08050: done with get_vars() 44842 1727204493.08086: variable 'omit' from source: magic vars PLAY [Test for testing routing rules] ****************************************** 44842 1727204493.08712: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44842 1727204493.09224: getting the remaining hosts for this loop 44842 1727204493.09226: done getting the remaining hosts for this loop 44842 1727204493.09229: getting the next task for host managed-node1 44842 1727204493.09239: done getting next task for host managed-node1 44842 1727204493.09241: ^ task is: TASK: Gathering Facts 44842 1727204493.09243: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204493.09245: getting variables 44842 1727204493.09252: in VariableManager get_vars() 44842 1727204493.09269: Calling all_inventory to load vars for managed-node1 44842 1727204493.09271: Calling groups_inventory to load vars for managed-node1 44842 1727204493.09273: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.09279: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.09292: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.09296: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.09478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.09888: done with get_vars() 44842 1727204493.09896: done getting variables 44842 1727204493.09936: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 Tuesday 24 September 2024 15:01:33 -0400 (0:00:00.082) 0:00:03.268 ***** 44842 1727204493.10023: entering _queue_task() for managed-node1/gather_facts 44842 1727204493.10381: worker is 1 (out of 1 available) 44842 1727204493.10399: exiting _queue_task() for managed-node1/gather_facts 44842 1727204493.10410: done queuing things up, now waiting for results queue to drain 44842 1727204493.10412: waiting for pending results... 
44842 1727204493.10753: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44842 1727204493.10924: in run() - task 0affcd87-79f5-aad0-d242-00000000010b 44842 1727204493.10945: variable 'ansible_search_path' from source: unknown 44842 1727204493.11001: calling self._execute() 44842 1727204493.11128: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.11140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.11154: variable 'omit' from source: magic vars 44842 1727204493.11581: variable 'ansible_distribution_major_version' from source: facts 44842 1727204493.11601: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204493.11618: variable 'omit' from source: magic vars 44842 1727204493.11652: variable 'omit' from source: magic vars 44842 1727204493.11695: variable 'omit' from source: magic vars 44842 1727204493.11757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204493.12058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204493.12121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204493.12143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204493.12171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204493.12236: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204493.12250: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.12266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.12405: Set connection var ansible_shell_type to sh 44842 1727204493.12435: Set connection 
var ansible_module_compression to ZIP_DEFLATED 44842 1727204493.12447: Set connection var ansible_connection to ssh 44842 1727204493.12457: Set connection var ansible_pipelining to False 44842 1727204493.12472: Set connection var ansible_timeout to 10 44842 1727204493.12484: Set connection var ansible_shell_executable to /bin/sh 44842 1727204493.12508: variable 'ansible_shell_executable' from source: unknown 44842 1727204493.12515: variable 'ansible_connection' from source: unknown 44842 1727204493.12529: variable 'ansible_module_compression' from source: unknown 44842 1727204493.12537: variable 'ansible_shell_type' from source: unknown 44842 1727204493.12543: variable 'ansible_shell_executable' from source: unknown 44842 1727204493.12549: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.12556: variable 'ansible_pipelining' from source: unknown 44842 1727204493.12568: variable 'ansible_timeout' from source: unknown 44842 1727204493.12576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.12774: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204493.12791: variable 'omit' from source: magic vars 44842 1727204493.12801: starting attempt loop 44842 1727204493.12808: running the handler 44842 1727204493.12844: variable 'ansible_facts' from source: unknown 44842 1727204493.12899: _low_level_execute_command(): starting 44842 1727204493.12911: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204493.13490: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 
1727204493.13517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.13536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.13589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204493.13602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204493.13668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204493.15814: stdout chunk (state=3): >>>/root <<< 44842 1727204493.15959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204493.16033: stderr chunk (state=3): >>><<< 44842 1727204493.16042: stdout chunk (state=3): >>><<< 44842 1727204493.16068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204493.16081: _low_level_execute_command(): starting 44842 1727204493.16087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462 `" && echo ansible-tmp-1727204493.1606681-45090-141092011397462="` echo /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462 `" ) && sleep 0' 44842 1727204493.16758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204493.16775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.16878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.16990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204493.16994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204493.16997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204493.17056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204493.19621: stdout chunk (state=3): >>>ansible-tmp-1727204493.1606681-45090-141092011397462=/root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462 <<< 44842 1727204493.19785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204493.19846: stderr chunk (state=3): >>><<< 44842 1727204493.19849: stdout chunk (state=3): >>><<< 44842 1727204493.19868: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204493.1606681-45090-141092011397462=/root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204493.19894: variable 'ansible_module_compression' from source: unknown 44842 1727204493.19942: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44842 1727204493.19991: variable 'ansible_facts' from source: unknown 44842 1727204493.20123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/AnsiballZ_setup.py 44842 1727204493.20323: Sending initial data 44842 1727204493.20342: Sent initial data (154 bytes) 44842 1727204493.21420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204493.21445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204493.21469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204493.21488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.21530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204493.21555: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204493.21575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.21593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204493.21604: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204493.21614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
44842 1727204493.21625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204493.21652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204493.21679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.21691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204493.21701: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204493.21714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.21802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204493.21823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204493.21838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204493.21925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204493.24403: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204493.24456: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204493.24516: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-44842e33nar6b/tmpkfpol7zy /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/AnsiballZ_setup.py <<< 44842 1727204493.24572: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204493.26671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204493.26775: stderr chunk (state=3): >>><<< 44842 1727204493.26778: stdout chunk (state=3): >>><<< 44842 1727204493.26796: done transferring module to remote 44842 1727204493.26805: _low_level_execute_command(): starting 44842 1727204493.26809: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/ /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/AnsiballZ_setup.py && sleep 0' 44842 1727204493.27267: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204493.27272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.27307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.27311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204493.27313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.27358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204493.27371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204493.27433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204493.29813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204493.29868: stderr chunk (state=3): >>><<< 44842 1727204493.29871: stdout chunk (state=3): >>><<< 44842 1727204493.29884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204493.29893: _low_level_execute_command(): starting 44842 1727204493.29898: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/AnsiballZ_setup.py && sleep 0' 44842 1727204493.30351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204493.30365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.30383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204493.30395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204493.30404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.30450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204493.30467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204493.30534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204493.86056: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmen<<< 44842 1727204493.86063: stdout chunk (state=3): >>>tation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", 
"interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": 
"off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]",<<< 44842 1727204493.86106: stdout chunk (state=3): >>> "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": 
"pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2794, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 738, "free": 2794}, "nocache": {"free": 3270, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": 
"ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sa<<< 44842 1727204493.86134: stdout chunk (state=3): >>>s_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 756, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271749120, "block_size": 4096, "block_total": 65519355, "block_available": 64519470, "block_used": 999885, "inode_total": 131071472, "inode_available": 130998229, "inode_used": 73243, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.47, "5m": 0.45, "15m": 0.28}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "33", "epoch": "1727204493", "epoch_int": "<<< 44842 1727204493.86148: stdout chunk (state=3): >>>1727204493", "date": "2024-09-24", "time": "15:01:33", "iso8601_micro": "2024-09-24T19:01:33.854341Z", "iso8601": "2024-09-24T19:01:33Z", "iso8601_basic": "20240924T150133854341", "iso8601_basic_short": "20240924T150133", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204493.88612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204493.88616: stdout chunk (state=3): >>><<< 44842 1727204493.88619: stderr chunk (state=3): >>><<< 44842 1727204493.89072: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2794, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 738, "free": 2794}, "nocache": {"free": 3270, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": 
"08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 756, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271749120, "block_size": 4096, "block_total": 65519355, "block_available": 64519470, "block_used": 999885, "inode_total": 131071472, "inode_available": 130998229, "inode_used": 73243, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.47, "5m": 0.45, "15m": 0.28}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "33", "epoch": "1727204493", "epoch_int": "1727204493", "date": "2024-09-24", "time": "15:01:33", "iso8601_micro": "2024-09-24T19:01:33.854341Z", "iso8601": "2024-09-24T19:01:33Z", "iso8601_basic": "20240924T150133854341", "iso8601_basic_short": "20240924T150133", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204493.89085: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204493.89088: _low_level_execute_command(): starting 44842 1727204493.89091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204493.1606681-45090-141092011397462/ > /dev/null 2>&1 && sleep 0' 44842 1727204493.89767: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204493.89782: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204493.89797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204493.90006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.90050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204493.90068: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204493.90084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.90101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204493.90112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204493.90123: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204493.90134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204493.90148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204493.90167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204493.90180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204493.90191: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204493.90204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204493.90284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204493.90305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204493.90321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204493.90415: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204493.93018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204493.93022: stdout chunk (state=3): >>><<< 44842 1727204493.93029: stderr chunk (state=3): >>><<< 44842 1727204493.93070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204493.93073: handler run complete 44842 1727204493.93275: variable 'ansible_facts' from source: unknown 44842 1727204493.93340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.93751: variable 'ansible_facts' from source: unknown 44842 1727204493.93867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.94057: attempt loop complete, returning result 
44842 1727204493.94069: _execute() done 44842 1727204493.94077: dumping result to json 44842 1727204493.94120: done dumping result, returning 44842 1727204493.94153: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-aad0-d242-00000000010b] 44842 1727204493.94165: sending task result for task 0affcd87-79f5-aad0-d242-00000000010b ok: [managed-node1] 44842 1727204493.95421: no more pending results, returning what we have 44842 1727204493.95423: results queue empty 44842 1727204493.95424: checking for any_errors_fatal 44842 1727204493.95425: done checking for any_errors_fatal 44842 1727204493.95426: checking for max_fail_percentage 44842 1727204493.95427: done checking for max_fail_percentage 44842 1727204493.95428: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.95429: done checking to see if all hosts have failed 44842 1727204493.95430: getting the remaining hosts for this loop 44842 1727204493.95431: done getting the remaining hosts for this loop 44842 1727204493.95434: getting the next task for host managed-node1 44842 1727204493.95439: done getting next task for host managed-node1 44842 1727204493.95441: ^ task is: TASK: meta (flush_handlers) 44842 1727204493.95443: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204493.95446: getting variables 44842 1727204493.95447: in VariableManager get_vars() 44842 1727204493.95478: Calling all_inventory to load vars for managed-node1 44842 1727204493.95481: Calling groups_inventory to load vars for managed-node1 44842 1727204493.95483: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.95493: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.95495: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.95498: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.95667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.95891: done with get_vars() 44842 1727204493.95908: done getting variables 44842 1727204493.95939: done sending task result for task 0affcd87-79f5-aad0-d242-00000000010b 44842 1727204493.95942: WORKER PROCESS EXITING 44842 1727204493.95984: in VariableManager get_vars() 44842 1727204493.95995: Calling all_inventory to load vars for managed-node1 44842 1727204493.95997: Calling groups_inventory to load vars for managed-node1 44842 1727204493.95999: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.96003: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.96005: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.96019: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.96174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.96407: done with get_vars() 44842 1727204493.96419: done queuing things up, now waiting for results queue to drain 44842 1727204493.96420: results queue empty 44842 1727204493.96421: checking for any_errors_fatal 44842 1727204493.96424: done checking for any_errors_fatal 44842 1727204493.96425: checking for max_fail_percentage 44842 
1727204493.96426: done checking for max_fail_percentage 44842 1727204493.96427: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.96427: done checking to see if all hosts have failed 44842 1727204493.96428: getting the remaining hosts for this loop 44842 1727204493.96429: done getting the remaining hosts for this loop 44842 1727204493.96431: getting the next task for host managed-node1 44842 1727204493.96435: done getting next task for host managed-node1 44842 1727204493.96437: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 44842 1727204493.96439: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204493.96441: getting variables 44842 1727204493.96448: in VariableManager get_vars() 44842 1727204493.96458: Calling all_inventory to load vars for managed-node1 44842 1727204493.96460: Calling groups_inventory to load vars for managed-node1 44842 1727204493.96462: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.96467: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.96470: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.96472: Calling groups_plugins_play to load vars for managed-node1 44842 1727204493.96622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204493.96841: done with get_vars() 44842 1727204493.96848: done getting variables 44842 1727204493.96897: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204493.97033: variable 'type' from source: play vars 44842 1727204493.97038: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:10 Tuesday 24 September 2024 15:01:33 -0400 (0:00:00.870) 0:00:04.138 ***** 44842 1727204493.97079: entering _queue_task() for managed-node1/set_fact 44842 1727204493.97357: worker is 1 (out of 1 available) 44842 1727204493.97368: exiting _queue_task() for managed-node1/set_fact 44842 1727204493.97380: done queuing things up, now waiting for results queue to drain 44842 1727204493.97381: waiting for pending results... 44842 1727204493.97631: running TaskExecutor() for managed-node1/TASK: Set type=veth and interface=ethtest0 44842 1727204493.97742: in run() - task 0affcd87-79f5-aad0-d242-00000000000b 44842 1727204493.97776: variable 'ansible_search_path' from source: unknown 44842 1727204493.97818: calling self._execute() 44842 1727204493.97911: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.97922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.97938: variable 'omit' from source: magic vars 44842 1727204493.98332: variable 'ansible_distribution_major_version' from source: facts 44842 1727204493.98346: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204493.98356: variable 'omit' from source: magic vars 44842 1727204493.98382: variable 'omit' from source: magic vars 44842 1727204493.98417: variable 'type' from source: play vars 44842 1727204493.98490: variable 'type' from source: play vars 44842 1727204493.98502: variable 'interface' from source: 
play vars 44842 1727204493.98589: variable 'interface' from source: play vars 44842 1727204493.98609: variable 'omit' from source: magic vars 44842 1727204493.98667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204493.98719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204493.98753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204493.98776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204493.98791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204493.98827: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204493.98837: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204493.98869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.98979: Set connection var ansible_shell_type to sh 44842 1727204493.98997: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204493.99007: Set connection var ansible_connection to ssh 44842 1727204493.99023: Set connection var ansible_pipelining to False 44842 1727204493.99032: Set connection var ansible_timeout to 10 44842 1727204493.99044: Set connection var ansible_shell_executable to /bin/sh 44842 1727204493.99082: variable 'ansible_shell_executable' from source: unknown 44842 1727204493.99090: variable 'ansible_connection' from source: unknown 44842 1727204493.99097: variable 'ansible_module_compression' from source: unknown 44842 1727204493.99104: variable 'ansible_shell_type' from source: unknown 44842 1727204493.99110: variable 'ansible_shell_executable' from source: unknown 44842 1727204493.99116: variable 'ansible_host' from 
source: host vars for 'managed-node1' 44842 1727204493.99128: variable 'ansible_pipelining' from source: unknown 44842 1727204493.99135: variable 'ansible_timeout' from source: unknown 44842 1727204493.99142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204493.99330: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204493.99397: variable 'omit' from source: magic vars 44842 1727204493.99418: starting attempt loop 44842 1727204493.99425: running the handler 44842 1727204493.99458: handler run complete 44842 1727204493.99475: attempt loop complete, returning result 44842 1727204493.99483: _execute() done 44842 1727204493.99489: dumping result to json 44842 1727204493.99523: done dumping result, returning 44842 1727204493.99535: done running TaskExecutor() for managed-node1/TASK: Set type=veth and interface=ethtest0 [0affcd87-79f5-aad0-d242-00000000000b] 44842 1727204493.99545: sending task result for task 0affcd87-79f5-aad0-d242-00000000000b 44842 1727204493.99656: done sending task result for task 0affcd87-79f5-aad0-d242-00000000000b ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 44842 1727204493.99723: no more pending results, returning what we have 44842 1727204493.99727: results queue empty 44842 1727204493.99728: checking for any_errors_fatal 44842 1727204493.99729: done checking for any_errors_fatal 44842 1727204493.99730: checking for max_fail_percentage 44842 1727204493.99732: done checking for max_fail_percentage 44842 1727204493.99732: checking to see if all hosts have failed and the running result is not ok 44842 1727204493.99733: done checking to see if all hosts have failed 44842 
1727204493.99734: getting the remaining hosts for this loop 44842 1727204493.99737: done getting the remaining hosts for this loop 44842 1727204493.99741: getting the next task for host managed-node1 44842 1727204493.99748: done getting next task for host managed-node1 44842 1727204493.99752: ^ task is: TASK: Include the task 'show_interfaces.yml' 44842 1727204493.99755: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204493.99759: getting variables 44842 1727204493.99761: in VariableManager get_vars() 44842 1727204493.99805: Calling all_inventory to load vars for managed-node1 44842 1727204493.99809: Calling groups_inventory to load vars for managed-node1 44842 1727204493.99812: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204493.99824: Calling all_plugins_play to load vars for managed-node1 44842 1727204493.99827: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204493.99831: Calling groups_plugins_play to load vars for managed-node1 44842 1727204494.00074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204494.00384: done with get_vars() 44842 1727204494.00403: done getting variables 44842 1727204494.00448: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:14 Tuesday 24 September 2024 15:01:34 -0400 (0:00:00.035) 0:00:04.174 ***** 44842 1727204494.00590: entering _queue_task() for managed-node1/include_tasks 44842 1727204494.00968: worker is 1 (out of 1 available) 44842 1727204494.00980: exiting _queue_task() for 
managed-node1/include_tasks 44842 1727204494.00991: done queuing things up, now waiting for results queue to drain 44842 1727204494.00992: waiting for pending results... 44842 1727204494.01252: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 44842 1727204494.01360: in run() - task 0affcd87-79f5-aad0-d242-00000000000c 44842 1727204494.01387: variable 'ansible_search_path' from source: unknown 44842 1727204494.01426: calling self._execute() 44842 1727204494.01519: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204494.01531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204494.01549: variable 'omit' from source: magic vars 44842 1727204494.01933: variable 'ansible_distribution_major_version' from source: facts 44842 1727204494.01952: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204494.01963: _execute() done 44842 1727204494.01973: dumping result to json 44842 1727204494.01984: done dumping result, returning 44842 1727204494.01993: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-aad0-d242-00000000000c] 44842 1727204494.02005: sending task result for task 0affcd87-79f5-aad0-d242-00000000000c 44842 1727204494.02133: no more pending results, returning what we have 44842 1727204494.02140: in VariableManager get_vars() 44842 1727204494.02184: Calling all_inventory to load vars for managed-node1 44842 1727204494.02187: Calling groups_inventory to load vars for managed-node1 44842 1727204494.02190: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204494.02204: Calling all_plugins_play to load vars for managed-node1 44842 1727204494.02207: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204494.02211: Calling groups_plugins_play to load vars for managed-node1 44842 1727204494.02410: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204494.02627: done with get_vars() 44842 1727204494.02635: variable 'ansible_search_path' from source: unknown 44842 1727204494.02650: we have included files to process 44842 1727204494.02652: generating all_blocks data 44842 1727204494.02654: done generating all_blocks data 44842 1727204494.02655: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44842 1727204494.02656: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44842 1727204494.02658: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44842 1727204494.02995: in VariableManager get_vars() 44842 1727204494.03081: done with get_vars() 44842 1727204494.03129: done sending task result for task 0affcd87-79f5-aad0-d242-00000000000c 44842 1727204494.03133: WORKER PROCESS EXITING 44842 1727204494.03287: done processing included file 44842 1727204494.03289: iterating over new_blocks loaded from include file 44842 1727204494.03290: in VariableManager get_vars() 44842 1727204494.03305: done with get_vars() 44842 1727204494.03306: filtering new block on tags 44842 1727204494.03324: done filtering new block on tags 44842 1727204494.03326: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 44842 1727204494.03331: extending task lists for all hosts with included blocks 44842 1727204494.06214: done extending task lists 44842 1727204494.06216: done processing included files 44842 1727204494.06217: results queue empty 44842 1727204494.06218: checking for any_errors_fatal 44842 1727204494.06221: done 
checking for any_errors_fatal 44842 1727204494.06222: checking for max_fail_percentage 44842 1727204494.06223: done checking for max_fail_percentage 44842 1727204494.06224: checking to see if all hosts have failed and the running result is not ok 44842 1727204494.06225: done checking to see if all hosts have failed 44842 1727204494.06225: getting the remaining hosts for this loop 44842 1727204494.06227: done getting the remaining hosts for this loop 44842 1727204494.06229: getting the next task for host managed-node1 44842 1727204494.06233: done getting next task for host managed-node1 44842 1727204494.06352: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44842 1727204494.06354: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204494.06358: getting variables 44842 1727204494.06359: in VariableManager get_vars() 44842 1727204494.06377: Calling all_inventory to load vars for managed-node1 44842 1727204494.06380: Calling groups_inventory to load vars for managed-node1 44842 1727204494.06382: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204494.06389: Calling all_plugins_play to load vars for managed-node1 44842 1727204494.06392: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204494.06395: Calling groups_plugins_play to load vars for managed-node1 44842 1727204494.06667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204494.07090: done with get_vars() 44842 1727204494.07101: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:01:34 -0400 (0:00:00.066) 0:00:04.241 ***** 44842 1727204494.07290: entering _queue_task() for managed-node1/include_tasks 44842 1727204494.08173: worker is 1 (out of 1 available) 44842 1727204494.08185: exiting _queue_task() for managed-node1/include_tasks 44842 1727204494.08271: done queuing things up, now waiting for results queue to drain 44842 1727204494.08273: waiting for pending results... 
44842 1727204494.09227: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 44842 1727204494.09729: in run() - task 0affcd87-79f5-aad0-d242-000000000121 44842 1727204494.09788: variable 'ansible_search_path' from source: unknown 44842 1727204494.09799: variable 'ansible_search_path' from source: unknown 44842 1727204494.09841: calling self._execute() 44842 1727204494.10056: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204494.10076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204494.10089: variable 'omit' from source: magic vars 44842 1727204494.10593: variable 'ansible_distribution_major_version' from source: facts 44842 1727204494.10614: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204494.10627: _execute() done 44842 1727204494.10636: dumping result to json 44842 1727204494.10643: done dumping result, returning 44842 1727204494.10652: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-aad0-d242-000000000121] 44842 1727204494.10668: sending task result for task 0affcd87-79f5-aad0-d242-000000000121 44842 1727204494.10883: done sending task result for task 0affcd87-79f5-aad0-d242-000000000121 44842 1727204494.10886: WORKER PROCESS EXITING 44842 1727204494.10959: no more pending results, returning what we have 44842 1727204494.10966: in VariableManager get_vars() 44842 1727204494.11008: Calling all_inventory to load vars for managed-node1 44842 1727204494.11010: Calling groups_inventory to load vars for managed-node1 44842 1727204494.11013: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204494.11025: Calling all_plugins_play to load vars for managed-node1 44842 1727204494.11028: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204494.11031: Calling groups_plugins_play to load vars for managed-node1 44842 
1727204494.11225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204494.11443: done with get_vars() 44842 1727204494.11452: variable 'ansible_search_path' from source: unknown 44842 1727204494.11453: variable 'ansible_search_path' from source: unknown 44842 1727204494.11500: we have included files to process 44842 1727204494.11501: generating all_blocks data 44842 1727204494.11503: done generating all_blocks data 44842 1727204494.11504: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44842 1727204494.11505: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44842 1727204494.11507: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44842 1727204494.12255: done processing included file 44842 1727204494.12257: iterating over new_blocks loaded from include file 44842 1727204494.12259: in VariableManager get_vars() 44842 1727204494.12276: done with get_vars() 44842 1727204494.12278: filtering new block on tags 44842 1727204494.12296: done filtering new block on tags 44842 1727204494.12298: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 44842 1727204494.12303: extending task lists for all hosts with included blocks 44842 1727204494.12777: done extending task lists 44842 1727204494.12779: done processing included files 44842 1727204494.12780: results queue empty 44842 1727204494.12780: checking for any_errors_fatal 44842 1727204494.12784: done checking for any_errors_fatal 44842 1727204494.12784: checking for max_fail_percentage 44842 1727204494.12785: done 
checking for max_fail_percentage 44842 1727204494.12786: checking to see if all hosts have failed and the running result is not ok 44842 1727204494.12787: done checking to see if all hosts have failed 44842 1727204494.12788: getting the remaining hosts for this loop 44842 1727204494.12789: done getting the remaining hosts for this loop 44842 1727204494.12791: getting the next task for host managed-node1 44842 1727204494.12796: done getting next task for host managed-node1 44842 1727204494.12798: ^ task is: TASK: Gather current interface info 44842 1727204494.12801: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204494.12803: getting variables 44842 1727204494.12804: in VariableManager get_vars() 44842 1727204494.12815: Calling all_inventory to load vars for managed-node1 44842 1727204494.12817: Calling groups_inventory to load vars for managed-node1 44842 1727204494.12819: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204494.12824: Calling all_plugins_play to load vars for managed-node1 44842 1727204494.12826: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204494.12829: Calling groups_plugins_play to load vars for managed-node1 44842 1727204494.13024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204494.13245: done with get_vars() 44842 1727204494.13255: done getting variables 44842 1727204494.13304: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:01:34 -0400 (0:00:00.060) 0:00:04.301 ***** 44842 1727204494.13332: entering _queue_task() for managed-node1/command 44842 1727204494.13617: worker is 1 (out of 1 available) 44842 1727204494.13630: exiting _queue_task() for managed-node1/command 44842 1727204494.13644: done queuing things up, now waiting for results queue to drain 44842 1727204494.13646: waiting for pending results... 
44842 1727204494.13901: running TaskExecutor() for managed-node1/TASK: Gather current interface info 44842 1727204494.14005: in run() - task 0affcd87-79f5-aad0-d242-0000000001b0 44842 1727204494.14023: variable 'ansible_search_path' from source: unknown 44842 1727204494.14031: variable 'ansible_search_path' from source: unknown 44842 1727204494.14081: calling self._execute() 44842 1727204494.14179: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204494.14193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204494.14210: variable 'omit' from source: magic vars 44842 1727204494.14605: variable 'ansible_distribution_major_version' from source: facts 44842 1727204494.14622: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204494.14638: variable 'omit' from source: magic vars 44842 1727204494.14694: variable 'omit' from source: magic vars 44842 1727204494.14737: variable 'omit' from source: magic vars 44842 1727204494.14787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204494.14828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204494.14858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204494.14886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204494.14903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204494.14942: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204494.14951: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204494.14968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 
1727204494.15076: Set connection var ansible_shell_type to sh 44842 1727204494.15091: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204494.15100: Set connection var ansible_connection to ssh 44842 1727204494.15108: Set connection var ansible_pipelining to False 44842 1727204494.15117: Set connection var ansible_timeout to 10 44842 1727204494.15127: Set connection var ansible_shell_executable to /bin/sh 44842 1727204494.15156: variable 'ansible_shell_executable' from source: unknown 44842 1727204494.15169: variable 'ansible_connection' from source: unknown 44842 1727204494.15181: variable 'ansible_module_compression' from source: unknown 44842 1727204494.15189: variable 'ansible_shell_type' from source: unknown 44842 1727204494.15196: variable 'ansible_shell_executable' from source: unknown 44842 1727204494.15203: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204494.15211: variable 'ansible_pipelining' from source: unknown 44842 1727204494.15218: variable 'ansible_timeout' from source: unknown 44842 1727204494.15225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204494.15376: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204494.15394: variable 'omit' from source: magic vars 44842 1727204494.15406: starting attempt loop 44842 1727204494.15412: running the handler 44842 1727204494.15430: _low_level_execute_command(): starting 44842 1727204494.15442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204494.16228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204494.16243: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204494.16259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.16288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.16330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.16342: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204494.16356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.16382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204494.16398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204494.16410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204494.16424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.16439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.16456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.16473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.16485: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204494.16498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.16583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204494.16608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204494.16630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204494.16731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 4 <<< 44842 1727204494.19070: stdout chunk (state=3): >>>/root <<< 44842 1727204494.19344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204494.19348: stdout chunk (state=3): >>><<< 44842 1727204494.19350: stderr chunk (state=3): >>><<< 44842 1727204494.19472: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204494.19476: _low_level_execute_command(): starting 44842 1727204494.19480: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532 `" && echo ansible-tmp-1727204494.1937642-45141-170527859276532="` echo /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532 `" ) && sleep 0' 44842 1727204494.21678: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204494.21699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.21718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.21739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.21798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.21812: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204494.21827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.21846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204494.21863: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204494.21881: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204494.21893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.21905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.21919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.21930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.21942: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204494.21954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.22084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204494.22113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204494.22132: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44842 1727204494.22232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204494.24853: stdout chunk (state=3): >>>ansible-tmp-1727204494.1937642-45141-170527859276532=/root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532 <<< 44842 1727204494.25122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204494.25126: stdout chunk (state=3): >>><<< 44842 1727204494.25129: stderr chunk (state=3): >>><<< 44842 1727204494.25375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204494.1937642-45141-170527859276532=/root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204494.25379: variable 'ansible_module_compression' from source: unknown 44842 1727204494.25381: 
ANSIBALLZ: Using generic lock for ansible.legacy.command 44842 1727204494.25383: ANSIBALLZ: Acquiring lock 44842 1727204494.25386: ANSIBALLZ: Lock acquired: 140164881036544 44842 1727204494.25388: ANSIBALLZ: Creating module 44842 1727204494.43520: ANSIBALLZ: Writing module into payload 44842 1727204494.43649: ANSIBALLZ: Writing module 44842 1727204494.43685: ANSIBALLZ: Renaming module 44842 1727204494.43703: ANSIBALLZ: Done creating module 44842 1727204494.43731: variable 'ansible_facts' from source: unknown 44842 1727204494.43804: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/AnsiballZ_command.py 44842 1727204494.44000: Sending initial data 44842 1727204494.44004: Sent initial data (156 bytes) 44842 1727204494.45184: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204494.45200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.45215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.45243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.45291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.45303: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204494.45316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.45338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204494.45355: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204494.45373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204494.45385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.45399: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.45414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.45426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.45437: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204494.45455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.45532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204494.45562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204494.45587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204494.45699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204494.48250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204494.48307: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204494.48366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpoj6lmg77 /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/AnsiballZ_command.py <<< 44842 1727204494.48426: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204494.49616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204494.49873: stderr chunk (state=3): >>><<< 44842 1727204494.49876: stdout chunk (state=3): >>><<< 44842 1727204494.49878: done transferring module to remote 44842 1727204494.49881: _low_level_execute_command(): starting 44842 1727204494.49883: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/ /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/AnsiballZ_command.py && sleep 0' 44842 1727204494.50513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204494.50528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.50547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.50568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.50613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.50625: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204494.50639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.50660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204494.50674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204494.50685: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204494.50695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.50707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 44842 1727204494.50721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.50731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.50742: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204494.50758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.50836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204494.50854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204494.50873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204494.50969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204494.53402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204494.53494: stderr chunk (state=3): >>><<< 44842 1727204494.53498: stdout chunk (state=3): >>><<< 44842 1727204494.53612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204494.53616: _low_level_execute_command(): starting 44842 1727204494.53619: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/AnsiballZ_command.py && sleep 0' 44842 1727204494.54251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204494.54276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.54291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.54308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.54349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.54360: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204494.54383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.54399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204494.54410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204494.54420: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204494.54430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.54443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.54457: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.54470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204494.54488: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204494.54502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.54580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204494.54608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204494.54624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204494.54725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204494.75924: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:34.753752", "end": "2024-09-24 15:01:34.758044", "delta": "0:00:00.004292", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204494.77666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204494.77671: stdout chunk (state=3): >>><<< 44842 1727204494.77673: stderr chunk (state=3): >>><<< 44842 1727204494.77835: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:34.753752", "end": "2024-09-24 15:01:34.758044", "delta": "0:00:00.004292", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
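Editor's note: the module result above shows the task simply runs `ls -1` with `chdir: /sys/class/net`, so the "current interfaces" list is just the entry names under that sysfs directory. A minimal standalone sketch of that gathering step (the function name and the parameterized path are illustrative, not part of the playbook; note that `/sys/class/net` also exposes the `bonding_masters` control file, which is why it appears alongside real interfaces in the stdout):

```python
import os

def gather_current_interfaces(sysfs_net="/sys/class/net"):
    """Return sorted entry names under sysfs_net, mirroring `ls -1`.

    When the bonding kernel module is loaded, /sys/class/net also
    contains the `bonding_masters` control file, so callers that want
    only real interfaces must filter that name out afterwards.
    """
    return sorted(os.listdir(sysfs_net))
```

Run against a live `/sys/class/net` on the managed node this would yield the same four entries seen in the task's stdout (`bonding_masters`, `eth0`, `lo`, `rpltstbr`), since `ls` sorts its output by default.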
44842 1727204494.77844: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204494.77846: _low_level_execute_command(): starting 44842 1727204494.77849: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204494.1937642-45141-170527859276532/ > /dev/null 2>&1 && sleep 0' 44842 1727204494.79205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204494.79209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204494.79253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204494.79257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204494.79263: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204494.79269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204494.79434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204494.79503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204494.80280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204494.82087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204494.82149: stderr chunk (state=3): >>><<< 44842 1727204494.82152: stdout chunk (state=3): >>><<< 44842 1727204494.82476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
44842 1727204494.82480: handler run complete
44842 1727204494.82482: Evaluated conditional (False): False
44842 1727204494.82484: attempt loop complete, returning result
44842 1727204494.82486: _execute() done
44842 1727204494.82488: dumping result to json
44842 1727204494.82490: done dumping result, returning
44842 1727204494.82492: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-aad0-d242-0000000001b0]
44842 1727204494.82494: sending task result for task 0affcd87-79f5-aad0-d242-0000000001b0
44842 1727204494.82571: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001b0
44842 1727204494.82574: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.004292",
    "end": "2024-09-24 15:01:34.758044",
    "rc": 0,
    "start": "2024-09-24 15:01:34.753752"
}

STDOUT:

bonding_masters
eth0
lo
rpltstbr

44842 1727204494.82655: no more pending results, returning what we have
44842 1727204494.82659: results queue empty
44842 1727204494.82662: checking for any_errors_fatal
44842 1727204494.82665: done checking for any_errors_fatal
44842 1727204494.82666: checking for max_fail_percentage
44842 1727204494.82667: done checking for max_fail_percentage
44842 1727204494.82668: checking to see if all hosts have failed and the running result is not ok
44842 1727204494.82669: done checking to see if all hosts have failed
44842 1727204494.82670: getting the remaining hosts for this loop
44842 1727204494.82672: done getting the remaining hosts for this loop
44842 1727204494.82676: getting the next task for host managed-node1
44842 1727204494.82681: done getting next task for host managed-node1
44842 1727204494.82684: ^ task is: TASK: Set current_interfaces
44842 1727204494.82688: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204494.82692: getting variables
44842 1727204494.82693: in VariableManager get_vars()
44842 1727204494.82730: Calling all_inventory to load vars for managed-node1
44842 1727204494.82732: Calling groups_inventory to load vars for managed-node1
44842 1727204494.82735: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204494.82745: Calling all_plugins_play to load vars for managed-node1
44842 1727204494.82747: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204494.82750: Calling groups_plugins_play to load vars for managed-node1
44842 1727204494.82948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204494.83296: done with get_vars()
44842 1727204494.83376: done getting variables
44842 1727204494.83441: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Tuesday 24 September 2024 15:01:34 -0400 (0:00:00.701) 0:00:05.002 *****
44842 1727204494.83479: entering _queue_task() for managed-node1/set_fact
44842 1727204494.83931: worker is 1 (out of 1 available)
44842 1727204494.83944: exiting _queue_task() for managed-node1/set_fact
44842 1727204494.83967: done queuing things up, now waiting for results queue to drain
44842 1727204494.83969: waiting for pending results...
44842 1727204494.84842: running TaskExecutor() for managed-node1/TASK: Set current_interfaces
44842 1727204494.85069: in run() - task 0affcd87-79f5-aad0-d242-0000000001b1
44842 1727204494.85089: variable 'ansible_search_path' from source: unknown
44842 1727204494.85098: variable 'ansible_search_path' from source: unknown
44842 1727204494.85253: calling self._execute()
44842 1727204494.85421: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.85847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.85866: variable 'omit' from source: magic vars
44842 1727204494.86250: variable 'ansible_distribution_major_version' from source: facts
44842 1727204494.86273: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204494.86286: variable 'omit' from source: magic vars
44842 1727204494.86354: variable 'omit' from source: magic vars
44842 1727204494.86482: variable '_current_interfaces' from source: set_fact
44842 1727204494.86553: variable 'omit' from source: magic vars
44842 1727204494.86609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204494.86670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204494.86703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204494.86725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204494.86748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204494.86798: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204494.86807: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.86815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.86937: Set connection var ansible_shell_type to sh
44842 1727204494.86952: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204494.86970: Set connection var ansible_connection to ssh
44842 1727204494.86986: Set connection var ansible_pipelining to False
44842 1727204494.86995: Set connection var ansible_timeout to 10
44842 1727204494.87005: Set connection var ansible_shell_executable to /bin/sh
44842 1727204494.87031: variable 'ansible_shell_executable' from source: unknown
44842 1727204494.87039: variable 'ansible_connection' from source: unknown
44842 1727204494.87045: variable 'ansible_module_compression' from source: unknown
44842 1727204494.87051: variable 'ansible_shell_type' from source: unknown
44842 1727204494.87056: variable 'ansible_shell_executable' from source: unknown
44842 1727204494.87070: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.87080: variable 'ansible_pipelining' from source: unknown
44842 1727204494.87092: variable 'ansible_timeout' from source: unknown
44842 1727204494.87100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.87245: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
44842 1727204494.87265: variable 'omit' from source: magic vars
44842 1727204494.87278: starting attempt loop
44842 1727204494.87291: running the handler
44842 1727204494.87316: handler run complete
44842 1727204494.87332: attempt loop complete, returning result
44842 1727204494.87340: _execute() done
44842 1727204494.87348: dumping result to json
44842 1727204494.87356: done dumping result, returning
44842 1727204494.87374: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-aad0-d242-0000000001b1]
44842 1727204494.87384: sending task result for task 0affcd87-79f5-aad0-d242-0000000001b1
44842 1727204494.87499: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001b1
44842 1727204494.87511: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo",
            "rpltstbr"
        ]
    },
    "changed": false
}
44842 1727204494.87647: no more pending results, returning what we have
44842 1727204494.87651: results queue empty
44842 1727204494.87652: checking for any_errors_fatal
44842 1727204494.87658: done checking for any_errors_fatal
44842 1727204494.87658: checking for max_fail_percentage
44842 1727204494.87662: done checking for max_fail_percentage
44842 1727204494.87665: checking to see if all hosts have failed and the running result is not ok
44842 1727204494.87666: done checking to see if all hosts have failed
44842 1727204494.87667: getting the remaining hosts for this loop
44842 1727204494.87669: done getting the remaining hosts for this loop
44842 1727204494.87673: getting the next task for host managed-node1
44842 1727204494.87681: done getting next task for host managed-node1
44842 1727204494.87683: ^ task is: TASK: Show current_interfaces
44842 1727204494.87686: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204494.87690: getting variables
44842 1727204494.87691: in VariableManager get_vars()
44842 1727204494.87721: Calling all_inventory to load vars for managed-node1
44842 1727204494.87724: Calling groups_inventory to load vars for managed-node1
44842 1727204494.87726: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204494.87736: Calling all_plugins_play to load vars for managed-node1
44842 1727204494.87739: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204494.87742: Calling groups_plugins_play to load vars for managed-node1
44842 1727204494.87926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204494.88171: done with get_vars()
44842 1727204494.88182: done getting variables
44842 1727204494.88408: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Tuesday 24 September 2024 15:01:34 -0400 (0:00:00.050) 0:00:05.053 *****
44842 1727204494.88551: entering _queue_task() for managed-node1/debug
44842 1727204494.88553: Creating lock for debug
44842 1727204494.88851: worker is 1 (out of 1 available)
44842 1727204494.88866: exiting _queue_task() for managed-node1/debug
44842 1727204494.88876: done queuing things up, now waiting for results queue to drain
44842 1727204494.88877: waiting for pending results...
44842 1727204494.89120: running TaskExecutor() for managed-node1/TASK: Show current_interfaces
44842 1727204494.89224: in run() - task 0affcd87-79f5-aad0-d242-000000000122
44842 1727204494.89243: variable 'ansible_search_path' from source: unknown
44842 1727204494.89251: variable 'ansible_search_path' from source: unknown
44842 1727204494.90076: calling self._execute()
44842 1727204494.90171: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.90324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.90338: variable 'omit' from source: magic vars
44842 1727204494.91039: variable 'ansible_distribution_major_version' from source: facts
44842 1727204494.91055: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204494.91072: variable 'omit' from source: magic vars
44842 1727204494.91232: variable 'omit' from source: magic vars
44842 1727204494.91344: variable 'current_interfaces' from source: set_fact
44842 1727204494.91439: variable 'omit' from source: magic vars
44842 1727204494.91562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204494.91606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204494.91694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204494.91755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204494.91776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204494.91882: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204494.91891: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.91899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.92122: Set connection var ansible_shell_type to sh
44842 1727204494.92138: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204494.92182: Set connection var ansible_connection to ssh
44842 1727204494.92194: Set connection var ansible_pipelining to False
44842 1727204494.92204: Set connection var ansible_timeout to 10
44842 1727204494.92283: Set connection var ansible_shell_executable to /bin/sh
44842 1727204494.92313: variable 'ansible_shell_executable' from source: unknown
44842 1727204494.92321: variable 'ansible_connection' from source: unknown
44842 1727204494.92328: variable 'ansible_module_compression' from source: unknown
44842 1727204494.92334: variable 'ansible_shell_type' from source: unknown
44842 1727204494.92341: variable 'ansible_shell_executable' from source: unknown
44842 1727204494.92347: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.92354: variable 'ansible_pipelining' from source: unknown
44842 1727204494.92373: variable 'ansible_timeout' from source: unknown
44842 1727204494.92381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.92686: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
44842 1727204494.92730: variable 'omit' from source: magic vars
44842 1727204494.92775: starting attempt loop
44842 1727204494.92782: running the handler
44842 1727204494.92948: handler run complete
44842 1727204494.92972: attempt loop complete, returning result
44842 1727204494.92980: _execute() done
44842 1727204494.92986: dumping result to json
44842 1727204494.92993: done dumping result, returning
44842 1727204494.93003: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-aad0-d242-000000000122]
44842 1727204494.93011: sending task result for task 0affcd87-79f5-aad0-d242-000000000122
44842 1727204494.93117: done sending task result for task 0affcd87-79f5-aad0-d242-000000000122
44842 1727204494.93122: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']

44842 1727204494.93195: no more pending results, returning what we have
44842 1727204494.93199: results queue empty
44842 1727204494.93200: checking for any_errors_fatal
44842 1727204494.93206: done checking for any_errors_fatal
44842 1727204494.93206: checking for max_fail_percentage
44842 1727204494.93208: done checking for max_fail_percentage
44842 1727204494.93209: checking to see if all hosts have failed and the running result is not ok
44842 1727204494.93210: done checking to see if all hosts have failed
44842 1727204494.93211: getting the remaining hosts for this loop
44842 1727204494.93212: done getting the remaining hosts for this loop
44842 1727204494.93218: getting the next task for host managed-node1
44842 1727204494.93227: done getting next task for host managed-node1
44842 1727204494.93230: ^ task is: TASK: Include the task 'manage_test_interface.yml'
44842 1727204494.93232: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204494.93236: getting variables
44842 1727204494.93238: in VariableManager get_vars()
44842 1727204494.93281: Calling all_inventory to load vars for managed-node1
44842 1727204494.93284: Calling groups_inventory to load vars for managed-node1
44842 1727204494.93287: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204494.93298: Calling all_plugins_play to load vars for managed-node1
44842 1727204494.93300: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204494.93303: Calling groups_plugins_play to load vars for managed-node1
44842 1727204494.93509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204494.93756: done with get_vars()
44842 1727204494.93772: done getting variables

TASK [Include the task 'manage_test_interface.yml'] ****************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:16
Tuesday 24 September 2024 15:01:34 -0400 (0:00:00.055) 0:00:05.109 *****
44842 1727204494.94091: entering _queue_task() for managed-node1/include_tasks
44842 1727204494.94786: worker is 1 (out of 1 available)
44842 1727204494.94799: exiting _queue_task() for managed-node1/include_tasks
44842 1727204494.94809: done queuing things up, now waiting for results queue to drain
44842 1727204494.94810: waiting for pending results...
44842 1727204494.95393: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml'
44842 1727204494.96150: in run() - task 0affcd87-79f5-aad0-d242-00000000000d
44842 1727204494.96170: variable 'ansible_search_path' from source: unknown
44842 1727204494.96210: calling self._execute()
44842 1727204494.96293: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204494.96304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204494.96319: variable 'omit' from source: magic vars
44842 1727204494.96646: variable 'ansible_distribution_major_version' from source: facts
44842 1727204494.96666: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204494.96678: _execute() done
44842 1727204494.96687: dumping result to json
44842 1727204494.96694: done dumping result, returning
44842 1727204494.96703: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-aad0-d242-00000000000d]
44842 1727204494.96714: sending task result for task 0affcd87-79f5-aad0-d242-00000000000d
44842 1727204494.97494: no more pending results, returning what we have
44842 1727204494.97499: in VariableManager get_vars()
44842 1727204494.97542: Calling all_inventory to load vars for managed-node1
44842 1727204494.97545: Calling groups_inventory to load vars for managed-node1
44842 1727204494.97547: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204494.97567: Calling all_plugins_play to load vars for managed-node1
44842 1727204494.97570: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204494.97574: Calling groups_plugins_play to load vars for managed-node1
44842 1727204494.97747: done sending task result for task 0affcd87-79f5-aad0-d242-00000000000d
44842 1727204494.97752: WORKER PROCESS EXITING
44842 1727204494.97772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204494.98003: done with get_vars()
44842 1727204494.98012: variable 'ansible_search_path' from source: unknown
44842 1727204494.98025: we have included files to process
44842 1727204494.98026: generating all_blocks data
44842 1727204494.98028: done generating all_blocks data
44842 1727204494.98034: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml
44842 1727204494.98035: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml
44842 1727204494.98039: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml
44842 1727204494.99213: in VariableManager get_vars()
44842 1727204494.99234: done with get_vars()
44842 1727204494.99706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
44842 1727204495.01246: done processing included file
44842 1727204495.01252: iterating over new_blocks loaded from include file
44842 1727204495.01253: in VariableManager get_vars()
44842 1727204495.01273: done with get_vars()
44842 1727204495.01275: filtering new block on tags
44842 1727204495.01307: done filtering new block on tags
44842 1727204495.01310: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1
44842 1727204495.01315: extending task lists for all hosts with included blocks
44842 1727204495.04154: done extending task lists
44842 1727204495.04156: done processing included files
44842 1727204495.04156: results queue empty
44842 1727204495.04157: checking for any_errors_fatal
44842 1727204495.04163: done checking for any_errors_fatal
44842 1727204495.04165: checking for max_fail_percentage
44842 1727204495.04166: done checking for max_fail_percentage
44842 1727204495.04167: checking to see if all hosts have failed and the running result is not ok
44842 1727204495.04168: done checking to see if all hosts have failed
44842 1727204495.04169: getting the remaining hosts for this loop
44842 1727204495.04170: done getting the remaining hosts for this loop
44842 1727204495.04173: getting the next task for host managed-node1
44842 1727204495.04177: done getting next task for host managed-node1
44842 1727204495.04179: ^ task is: TASK: Ensure state in ["present", "absent"]
44842 1727204495.04182: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204495.04184: getting variables
44842 1727204495.04185: in VariableManager get_vars()
44842 1727204495.04311: Calling all_inventory to load vars for managed-node1
44842 1727204495.04314: Calling groups_inventory to load vars for managed-node1
44842 1727204495.04317: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204495.04323: Calling all_plugins_play to load vars for managed-node1
44842 1727204495.04326: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204495.04328: Calling groups_plugins_play to load vars for managed-node1
44842 1727204495.04612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204495.05057: done with get_vars()
44842 1727204495.05073: done getting variables
44842 1727204495.05259: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Ensure state in ["present", "absent"]] ***********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3
Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.111) 0:00:05.221 *****
44842 1727204495.05368: entering _queue_task() for managed-node1/fail
44842 1727204495.05371: Creating lock for fail
44842 1727204495.05930: worker is 1 (out of 1 available)
44842 1727204495.06057: exiting _queue_task() for managed-node1/fail
44842 1727204495.06074: done queuing things up, now waiting for results queue to drain
44842 1727204495.06076: waiting for pending results...
44842 1727204495.06981: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"]
44842 1727204495.07094: in run() - task 0affcd87-79f5-aad0-d242-0000000001cc
44842 1727204495.07169: variable 'ansible_search_path' from source: unknown
44842 1727204495.07268: variable 'ansible_search_path' from source: unknown
44842 1727204495.07307: calling self._execute()
44842 1727204495.07607: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204495.07617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204495.07630: variable 'omit' from source: magic vars
44842 1727204495.07973: variable 'ansible_distribution_major_version' from source: facts
44842 1727204495.08684: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204495.08830: variable 'state' from source: include params
44842 1727204495.08842: Evaluated conditional (state not in ["present", "absent"]): False
44842 1727204495.08850: when evaluation is False, skipping this task
44842 1727204495.08858: _execute() done
44842 1727204495.08867: dumping result to json
44842 1727204495.08875: done dumping result, returning
44842 1727204495.08885: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-aad0-d242-0000000001cc]
44842 1727204495.08895: sending task result for task 0affcd87-79f5-aad0-d242-0000000001cc
44842 1727204495.09004: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001cc
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "state not in [\"present\", \"absent\"]",
    "skip_reason": "Conditional result was False"
}
44842 1727204495.09054: no more pending results, returning what we have
44842 1727204495.09058: results queue empty
44842 1727204495.09061: checking for any_errors_fatal
44842 1727204495.09063: done checking for any_errors_fatal
44842 1727204495.09066: checking for max_fail_percentage
44842 1727204495.09067: done checking for max_fail_percentage
44842 1727204495.09068: checking to see if all hosts have failed and the running result is not ok
44842 1727204495.09069: done checking to see if all hosts have failed
44842 1727204495.09070: getting the remaining hosts for this loop
44842 1727204495.09071: done getting the remaining hosts for this loop
44842 1727204495.09075: getting the next task for host managed-node1
44842 1727204495.09081: done getting next task for host managed-node1
44842 1727204495.09084: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"]
44842 1727204495.09087: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204495.09090: getting variables
44842 1727204495.09092: in VariableManager get_vars()
44842 1727204495.09130: Calling all_inventory to load vars for managed-node1
44842 1727204495.09133: Calling groups_inventory to load vars for managed-node1
44842 1727204495.09135: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204495.09149: Calling all_plugins_play to load vars for managed-node1
44842 1727204495.09152: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204495.09155: Calling groups_plugins_play to load vars for managed-node1
44842 1727204495.09336: WORKER PROCESS EXITING
44842 1727204495.09350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204495.10534: done with get_vars()
44842 1727204495.10545: done getting variables
44842 1727204495.10605: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Ensure type in ["dummy", "tap", "veth"]] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8
Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.053) 0:00:05.274 *****
44842 1727204495.10632: entering _queue_task() for managed-node1/fail
44842 1727204495.10898: worker is 1 (out of 1 available)
44842 1727204495.10909: exiting _queue_task() for managed-node1/fail
44842 1727204495.10920: done queuing things up, now waiting for results queue to drain
44842 1727204495.10921: waiting for pending results...
44842 1727204495.11897: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"]
44842 1727204495.12113: in run() - task 0affcd87-79f5-aad0-d242-0000000001cd
44842 1727204495.12131: variable 'ansible_search_path' from source: unknown
44842 1727204495.12137: variable 'ansible_search_path' from source: unknown
44842 1727204495.12190: calling self._execute()
44842 1727204495.12378: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204495.12389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204495.12521: variable 'omit' from source: magic vars
44842 1727204495.13392: variable 'ansible_distribution_major_version' from source: facts
44842 1727204495.13416: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204495.13830: variable 'type' from source: set_fact
44842 1727204495.13846: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False
44842 1727204495.13854: when evaluation is False, skipping this task
44842 1727204495.13869: _execute() done
44842 1727204495.13879: dumping result to json
44842 1727204495.13890: done dumping result, returning
44842 1727204495.13899: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-aad0-d242-0000000001cd]
44842 1727204495.13910: sending task result for task 0affcd87-79f5-aad0-d242-0000000001cd
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]",
    "skip_reason": "Conditional result was False"
}
44842 1727204495.14069: no more pending results, returning what we have
44842 1727204495.14073: results queue empty
44842 1727204495.14075: checking for any_errors_fatal
44842 1727204495.14082: done checking for any_errors_fatal
44842 1727204495.14083: checking for max_fail_percentage
44842 1727204495.14085: done checking for max_fail_percentage
44842 1727204495.14086: checking to see if all hosts have failed and the running result is not ok
44842 1727204495.14087: done checking to see if all hosts have failed
44842 1727204495.14088: getting the remaining hosts for this loop
44842 1727204495.14090: done getting the remaining hosts for this loop
44842 1727204495.14094: getting the next task for host managed-node1
44842 1727204495.14102: done getting next task for host managed-node1
44842 1727204495.14105: ^ task is: TASK: Include the task 'show_interfaces.yml'
44842 1727204495.14109: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204495.14114: getting variables
44842 1727204495.14115: in VariableManager get_vars()
44842 1727204495.14155: Calling all_inventory to load vars for managed-node1
44842 1727204495.14158: Calling groups_inventory to load vars for managed-node1
44842 1727204495.14161: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204495.14178: Calling all_plugins_play to load vars for managed-node1
44842 1727204495.14181: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204495.14184: Calling groups_plugins_play to load vars for managed-node1
44842 1727204495.14378: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001cd
44842 1727204495.14383: WORKER PROCESS EXITING
44842 1727204495.14398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204495.14603: done with get_vars()
44842 1727204495.14614: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13
Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.040) 0:00:05.315 *****
44842 1727204495.14698: entering _queue_task() for managed-node1/include_tasks
44842 1727204495.15660: worker is 1 (out of 1 available)
44842 1727204495.15673: exiting _queue_task() for managed-node1/include_tasks
44842 1727204495.15686: done queuing things up, now waiting for results queue to drain
44842 1727204495.15688: waiting for pending results...
44842 1727204495.16547: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 44842 1727204495.16834: in run() - task 0affcd87-79f5-aad0-d242-0000000001ce 44842 1727204495.16857: variable 'ansible_search_path' from source: unknown 44842 1727204495.16886: variable 'ansible_search_path' from source: unknown 44842 1727204495.16977: calling self._execute() 44842 1727204495.17136: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.17272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.17289: variable 'omit' from source: magic vars 44842 1727204495.18298: variable 'ansible_distribution_major_version' from source: facts 44842 1727204495.18321: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204495.18382: _execute() done 44842 1727204495.18392: dumping result to json 44842 1727204495.18400: done dumping result, returning 44842 1727204495.18472: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-aad0-d242-0000000001ce] 44842 1727204495.18490: sending task result for task 0affcd87-79f5-aad0-d242-0000000001ce 44842 1727204495.18598: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001ce 44842 1727204495.18628: no more pending results, returning what we have 44842 1727204495.18633: in VariableManager get_vars() 44842 1727204495.18678: Calling all_inventory to load vars for managed-node1 44842 1727204495.18682: Calling groups_inventory to load vars for managed-node1 44842 1727204495.18684: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.18699: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.18703: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.18707: Calling groups_plugins_play to load vars for managed-node1 44842 1727204495.18957: WORKER PROCESS EXITING 44842 1727204495.18978: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.19216: done with get_vars() 44842 1727204495.19224: variable 'ansible_search_path' from source: unknown 44842 1727204495.19226: variable 'ansible_search_path' from source: unknown 44842 1727204495.19477: we have included files to process 44842 1727204495.19478: generating all_blocks data 44842 1727204495.19481: done generating all_blocks data 44842 1727204495.19485: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44842 1727204495.19486: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44842 1727204495.19489: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44842 1727204495.19712: in VariableManager get_vars() 44842 1727204495.19735: done with get_vars() 44842 1727204495.20122: done processing included file 44842 1727204495.20124: iterating over new_blocks loaded from include file 44842 1727204495.20125: in VariableManager get_vars() 44842 1727204495.20143: done with get_vars() 44842 1727204495.20144: filtering new block on tags 44842 1727204495.20285: done filtering new block on tags 44842 1727204495.20288: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 44842 1727204495.20294: extending task lists for all hosts with included blocks 44842 1727204495.22015: done extending task lists 44842 1727204495.22017: done processing included files 44842 1727204495.22018: results queue empty 44842 1727204495.22018: checking for any_errors_fatal 44842 1727204495.22021: done checking for any_errors_fatal 44842 1727204495.22022: checking for 
max_fail_percentage 44842 1727204495.22023: done checking for max_fail_percentage 44842 1727204495.22024: checking to see if all hosts have failed and the running result is not ok 44842 1727204495.22025: done checking to see if all hosts have failed 44842 1727204495.22025: getting the remaining hosts for this loop 44842 1727204495.22026: done getting the remaining hosts for this loop 44842 1727204495.22029: getting the next task for host managed-node1 44842 1727204495.22033: done getting next task for host managed-node1 44842 1727204495.22035: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44842 1727204495.22037: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204495.22041: getting variables 44842 1727204495.22042: in VariableManager get_vars() 44842 1727204495.22054: Calling all_inventory to load vars for managed-node1 44842 1727204495.22056: Calling groups_inventory to load vars for managed-node1 44842 1727204495.22058: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.22068: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.22071: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.22073: Calling groups_plugins_play to load vars for managed-node1 44842 1727204495.22688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.23518: done with get_vars() 44842 1727204495.23530: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.089) 0:00:05.404 *****

44842 1727204495.23613: entering _queue_task() for managed-node1/include_tasks 44842 1727204495.24257: worker is 1 (out of 1 available) 44842 1727204495.24273: exiting _queue_task() for managed-node1/include_tasks 44842 1727204495.24381: done queuing things up, now waiting for results queue to drain 44842 1727204495.24382: waiting for pending results...
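Each included file above goes through the same pipeline: load data, generate blocks, then "filtering new block on tags" before the host's task list is extended. A toy version of that filtering step (illustrative only; Ansible's real implementation also handles tag inheritance and the special `never` tag) might look like:

```python
def filter_on_tags(tasks, only_tags):
    """Keep tasks whose tags intersect the run's --tags selection.

    With no selection every task runs; tasks tagged 'always' run regardless.
    This mirrors the spirit of the 'filtering new block on tags' log step,
    not Ansible's exact semantics.
    """
    if not only_tags:
        return list(tasks)
    kept = []
    for task in tasks:
        tags = set(task.get("tags", []))
        if "always" in tags or tags & set(only_tags):
            kept.append(task)
    return kept

# Hypothetical tag values for two tasks from an included file.
blocks = [
    {"name": "Gather current interface info", "tags": ["setup"]},
    {"name": "Debug output", "tags": ["debug"]},
]
all_tasks = filter_on_tags(blocks, [])         # no --tags: keep everything
setup_only = filter_on_tags(blocks, ["setup"])  # --tags setup: one task left
```
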
44842 1727204495.24913: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 44842 1727204495.25004: in run() - task 0affcd87-79f5-aad0-d242-000000000275 44842 1727204495.25014: variable 'ansible_search_path' from source: unknown 44842 1727204495.25018: variable 'ansible_search_path' from source: unknown 44842 1727204495.25054: calling self._execute() 44842 1727204495.25139: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.25144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.25153: variable 'omit' from source: magic vars 44842 1727204495.26214: variable 'ansible_distribution_major_version' from source: facts 44842 1727204495.26226: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204495.26234: _execute() done 44842 1727204495.26237: dumping result to json 44842 1727204495.26240: done dumping result, returning 44842 1727204495.26246: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-aad0-d242-000000000275] 44842 1727204495.26252: sending task result for task 0affcd87-79f5-aad0-d242-000000000275 44842 1727204495.26342: done sending task result for task 0affcd87-79f5-aad0-d242-000000000275 44842 1727204495.26345: WORKER PROCESS EXITING 44842 1727204495.26376: no more pending results, returning what we have 44842 1727204495.26381: in VariableManager get_vars() 44842 1727204495.26422: Calling all_inventory to load vars for managed-node1 44842 1727204495.26424: Calling groups_inventory to load vars for managed-node1 44842 1727204495.26426: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.26440: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.26443: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.26446: Calling groups_plugins_play to load vars for managed-node1 44842 
1727204495.26641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.26889: done with get_vars() 44842 1727204495.26898: variable 'ansible_search_path' from source: unknown 44842 1727204495.26899: variable 'ansible_search_path' from source: unknown 44842 1727204495.26956: we have included files to process 44842 1727204495.26958: generating all_blocks data 44842 1727204495.26959: done generating all_blocks data 44842 1727204495.26965: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44842 1727204495.26966: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44842 1727204495.26968: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44842 1727204495.27603: done processing included file 44842 1727204495.27605: iterating over new_blocks loaded from include file 44842 1727204495.27607: in VariableManager get_vars() 44842 1727204495.27626: done with get_vars() 44842 1727204495.27628: filtering new block on tags 44842 1727204495.27768: done filtering new block on tags 44842 1727204495.27771: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 44842 1727204495.27777: extending task lists for all hosts with included blocks 44842 1727204495.28072: done extending task lists 44842 1727204495.28074: done processing included files 44842 1727204495.28189: results queue empty 44842 1727204495.28190: checking for any_errors_fatal 44842 1727204495.28194: done checking for any_errors_fatal 44842 1727204495.28194: checking for max_fail_percentage 44842 1727204495.28196: done 
checking for max_fail_percentage 44842 1727204495.28197: checking to see if all hosts have failed and the running result is not ok 44842 1727204495.28198: done checking to see if all hosts have failed 44842 1727204495.28198: getting the remaining hosts for this loop 44842 1727204495.28200: done getting the remaining hosts for this loop 44842 1727204495.28203: getting the next task for host managed-node1 44842 1727204495.28208: done getting next task for host managed-node1 44842 1727204495.28210: ^ task is: TASK: Gather current interface info 44842 1727204495.28214: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204495.28217: getting variables 44842 1727204495.28218: in VariableManager get_vars() 44842 1727204495.28230: Calling all_inventory to load vars for managed-node1 44842 1727204495.28232: Calling groups_inventory to load vars for managed-node1 44842 1727204495.28234: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.28239: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.28242: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.28244: Calling groups_plugins_play to load vars for managed-node1 44842 1727204495.28627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.28870: done with get_vars() 44842 1727204495.28880: done getting variables 44842 1727204495.28919: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.053) 0:00:05.457 *****

44842 1727204495.28956: entering _queue_task() for managed-node1/command 44842 1727204495.29290: worker is 1 (out of 1 available) 44842 1727204495.29302: exiting _queue_task() for managed-node1/command 44842 1727204495.29316: done queuing things up, now waiting for results queue to drain 44842 1727204495.29318: waiting for pending results...
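The command task just queued is ultimately driven through `_low_level_execute_command()`, which hands a command string to `/bin/sh -c` and collects rc/stdout/stderr; in this run that happens over a multiplexed SSH ControlMaster connection, and the first probe issued is a home-directory discovery (`echo ~ && sleep 0`). A rough local stand-in using only the stdlib, with the same probe:

```python
import subprocess

def low_level_execute(cmd: str):
    """Local analogue of _low_level_execute_command(): run `cmd` through
    /bin/sh -c and return (rc, stdout, stderr). The real code path in this
    log forks ssh with connection multiplexing instead of a local shell."""
    proc = subprocess.run(["/bin/sh", "-c", cmd],
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# The connection plugin's first probe: discover the remote user's home dir.
rc, out, err = low_level_execute("echo ~ && sleep 0")
```

The temp-dir creation and `chmod u+x` commands that follow in the log are issued through exactly this kind of call, one shell string at a time.
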
44842 1727204495.29697: running TaskExecutor() for managed-node1/TASK: Gather current interface info 44842 1727204495.29832: in run() - task 0affcd87-79f5-aad0-d242-0000000002ac 44842 1727204495.29849: variable 'ansible_search_path' from source: unknown 44842 1727204495.29856: variable 'ansible_search_path' from source: unknown 44842 1727204495.29903: calling self._execute() 44842 1727204495.30001: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.30011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.30024: variable 'omit' from source: magic vars 44842 1727204495.30433: variable 'ansible_distribution_major_version' from source: facts 44842 1727204495.30452: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204495.30470: variable 'omit' from source: magic vars 44842 1727204495.30546: variable 'omit' from source: magic vars 44842 1727204495.30604: variable 'omit' from source: magic vars 44842 1727204495.30654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204495.30705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204495.30729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204495.30754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.30773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.30813: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204495.30821: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.30828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 
1727204495.30937: Set connection var ansible_shell_type to sh 44842 1727204495.30952: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204495.30964: Set connection var ansible_connection to ssh 44842 1727204495.30981: Set connection var ansible_pipelining to False 44842 1727204495.30991: Set connection var ansible_timeout to 10 44842 1727204495.31002: Set connection var ansible_shell_executable to /bin/sh 44842 1727204495.31033: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.31040: variable 'ansible_connection' from source: unknown 44842 1727204495.31047: variable 'ansible_module_compression' from source: unknown 44842 1727204495.31052: variable 'ansible_shell_type' from source: unknown 44842 1727204495.31058: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.31069: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.31082: variable 'ansible_pipelining' from source: unknown 44842 1727204495.31089: variable 'ansible_timeout' from source: unknown 44842 1727204495.31097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.31259: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204495.31279: variable 'omit' from source: magic vars 44842 1727204495.31288: starting attempt loop 44842 1727204495.31298: running the handler 44842 1727204495.31315: _low_level_execute_command(): starting 44842 1727204495.31325: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204495.32244: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204495.32278: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204495.32297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.32318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.32374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.32388: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204495.32406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.32425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204495.32437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204495.32450: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204495.32474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.32488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.32506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.32517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.32527: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204495.32540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.32638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204495.32667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.32690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.32795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 4 <<< 44842 1727204495.34405: stdout chunk (state=3): >>>/root <<< 44842 1727204495.34599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.34605: stdout chunk (state=3): >>><<< 44842 1727204495.34608: stderr chunk (state=3): >>><<< 44842 1727204495.34732: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204495.34736: _low_level_execute_command(): starting 44842 1727204495.34739: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034 `" && echo ansible-tmp-1727204495.3463354-45187-97328350932034="` echo /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034 `" ) && sleep 0' 44842 1727204495.36297: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204495.36369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.36384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.36400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.36441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.36474: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204495.36488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.36504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204495.36514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204495.36524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204495.36582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.36596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.36609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.36618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.36627: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204495.36637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.36824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204495.36847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.36866: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44842 1727204495.36984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204495.38823: stdout chunk (state=3): >>>ansible-tmp-1727204495.3463354-45187-97328350932034=/root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034 <<< 44842 1727204495.39041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.39045: stdout chunk (state=3): >>><<< 44842 1727204495.39047: stderr chunk (state=3): >>><<< 44842 1727204495.39273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204495.3463354-45187-97328350932034=/root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 44842 1727204495.39277: variable 'ansible_module_compression' from source: unknown 44842 1727204495.39280: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204495.39282: variable 'ansible_facts' from source: unknown 44842 1727204495.39312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/AnsiballZ_command.py 44842 1727204495.39988: Sending initial data 44842 1727204495.39991: Sent initial data (155 bytes) 44842 1727204495.43137: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204495.43165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.43182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.43202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.43250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.43272: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204495.43288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.43313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204495.43326: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204495.43343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204495.43363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.43387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.43406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.43420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 <<< 44842 1727204495.43436: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204495.43451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.43534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204495.43556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.43578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.43673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 44842 1727204495.45551: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204495.45594: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204495.45662: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp2l6dza5s /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/AnsiballZ_command.py <<< 44842 1727204495.45706: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204495.46975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.47189: stderr chunk (state=3): >>><<< 44842 1727204495.47193: stdout chunk (state=3): >>><<< 44842 1727204495.47195: 
done transferring module to remote 44842 1727204495.47202: _low_level_execute_command(): starting 44842 1727204495.47205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/ /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/AnsiballZ_command.py && sleep 0' 44842 1727204495.47886: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204495.47902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.47917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.47935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.47988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.48001: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204495.48015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.48034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204495.48067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204495.48081: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204495.48099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.48114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.48137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.48150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.48176: stderr chunk (state=3): >>>debug2: 
match found <<< 44842 1727204495.48197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.48310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204495.48331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.48347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.48450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204495.50181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.50301: stderr chunk (state=3): >>><<< 44842 1727204495.50305: stdout chunk (state=3): >>><<< 44842 1727204495.50405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204495.50409: 
_low_level_execute_command(): starting 44842 1727204495.50413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/AnsiballZ_command.py && sleep 0' 44842 1727204495.51770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.51774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.51801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.51805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.51807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.51898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.51901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.51974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204495.72073: stdout chunk (state=3): >>> <<< 44842 1727204495.72082: stdout chunk (state=3): >>>{"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": 
"2024-09-24 15:01:35.715010", "end": "2024-09-24 15:01:35.719852", "delta": "0:00:00.004842", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204495.73624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204495.73684: stderr chunk (state=3): >>><<< 44842 1727204495.73690: stdout chunk (state=3): >>><<< 44842 1727204495.73706: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:35.715010", "end": "2024-09-24 15:01:35.719852", "delta": "0:00:00.004842", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204495.73742: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204495.73746: _low_level_execute_command(): starting 44842 1727204495.73751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204495.3463354-45187-97328350932034/ > /dev/null 2>&1 && sleep 0' 44842 1727204495.74208: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.74212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.74274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.74278: stderr chunk (state=3): >>>debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.74280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.74282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.74284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.74286: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.74340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204495.74343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.74345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.74410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204495.76919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.76923: stderr chunk (state=3): >>><<< 44842 1727204495.76928: stdout chunk (state=3): >>><<< 44842 1727204495.76947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204495.76953: handler run complete 44842 1727204495.76987: Evaluated conditional (False): False 44842 1727204495.76997: attempt loop complete, returning result 44842 1727204495.77000: _execute() done 44842 1727204495.77011: dumping result to json 44842 1727204495.77022: done dumping result, returning 44842 1727204495.77031: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-aad0-d242-0000000002ac] 44842 1727204495.77037: sending task result for task 0affcd87-79f5-aad0-d242-0000000002ac ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004842", "end": "2024-09-24 15:01:35.719852", "rc": 0, "start": "2024-09-24 15:01:35.715010" } STDOUT: bonding_masters eth0 lo rpltstbr 44842 1727204495.77228: no more pending results, returning what we have 44842 1727204495.77232: results queue empty 44842 1727204495.77233: checking for any_errors_fatal 44842 1727204495.77235: done checking for any_errors_fatal 44842 1727204495.77235: checking for max_fail_percentage 44842 1727204495.77237: done checking for max_fail_percentage 44842 1727204495.77239: checking to see if all hosts have failed and the running result is not ok 44842 1727204495.77240: done checking to see if all hosts have failed 44842 1727204495.77241: getting the remaining hosts for this loop 44842 1727204495.77243: 
done getting the remaining hosts for this loop 44842 1727204495.77247: getting the next task for host managed-node1 44842 1727204495.77255: done getting next task for host managed-node1 44842 1727204495.77258: ^ task is: TASK: Set current_interfaces 44842 1727204495.77267: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204495.77272: getting variables 44842 1727204495.77274: in VariableManager get_vars() 44842 1727204495.77313: Calling all_inventory to load vars for managed-node1 44842 1727204495.77316: Calling groups_inventory to load vars for managed-node1 44842 1727204495.77319: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.77331: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.77334: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.77337: Calling groups_plugins_play to load vars for managed-node1 44842 1727204495.77541: done sending task result for task 0affcd87-79f5-aad0-d242-0000000002ac 44842 1727204495.77545: WORKER PROCESS EXITING 44842 1727204495.77595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.78052: done with get_vars() 44842 1727204495.78070: done getting variables 44842 1727204495.78137: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.492) 0:00:05.949 ***** 44842 1727204495.78180: entering _queue_task() for managed-node1/set_fact 44842 1727204495.78457: worker is 1 (out of 1 available) 44842 1727204495.78473: exiting _queue_task() for managed-node1/set_fact 44842 1727204495.78487: done queuing things up, now waiting for results queue to drain 44842 1727204495.78489: waiting for pending results... 
44842 1727204495.78767: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 44842 1727204495.78871: in run() - task 0affcd87-79f5-aad0-d242-0000000002ad 44842 1727204495.78884: variable 'ansible_search_path' from source: unknown 44842 1727204495.78893: variable 'ansible_search_path' from source: unknown 44842 1727204495.78926: calling self._execute() 44842 1727204495.79140: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.79143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.79279: variable 'omit' from source: magic vars 44842 1727204495.79585: variable 'ansible_distribution_major_version' from source: facts 44842 1727204495.79598: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204495.79605: variable 'omit' from source: magic vars 44842 1727204495.79684: variable 'omit' from source: magic vars 44842 1727204495.79798: variable '_current_interfaces' from source: set_fact 44842 1727204495.79862: variable 'omit' from source: magic vars 44842 1727204495.79917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204495.79951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204495.79977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204495.80000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.80011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.80043: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204495.80052: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.80055: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.80171: Set connection var ansible_shell_type to sh 44842 1727204495.80182: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204495.80187: Set connection var ansible_connection to ssh 44842 1727204495.80193: Set connection var ansible_pipelining to False 44842 1727204495.80208: Set connection var ansible_timeout to 10 44842 1727204495.80215: Set connection var ansible_shell_executable to /bin/sh 44842 1727204495.80237: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.80240: variable 'ansible_connection' from source: unknown 44842 1727204495.80244: variable 'ansible_module_compression' from source: unknown 44842 1727204495.80247: variable 'ansible_shell_type' from source: unknown 44842 1727204495.80249: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.80251: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.80253: variable 'ansible_pipelining' from source: unknown 44842 1727204495.80255: variable 'ansible_timeout' from source: unknown 44842 1727204495.80260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.80427: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204495.80436: variable 'omit' from source: magic vars 44842 1727204495.80442: starting attempt loop 44842 1727204495.80445: running the handler 44842 1727204495.80456: handler run complete 44842 1727204495.80471: attempt loop complete, returning result 44842 1727204495.80474: _execute() done 44842 1727204495.80478: dumping result to json 44842 1727204495.80480: done dumping result, returning 44842 
1727204495.80491: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-aad0-d242-0000000002ad] 44842 1727204495.80496: sending task result for task 0affcd87-79f5-aad0-d242-0000000002ad 44842 1727204495.80582: done sending task result for task 0affcd87-79f5-aad0-d242-0000000002ad 44842 1727204495.80585: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 44842 1727204495.80655: no more pending results, returning what we have 44842 1727204495.80659: results queue empty 44842 1727204495.80663: checking for any_errors_fatal 44842 1727204495.80674: done checking for any_errors_fatal 44842 1727204495.80675: checking for max_fail_percentage 44842 1727204495.80677: done checking for max_fail_percentage 44842 1727204495.80678: checking to see if all hosts have failed and the running result is not ok 44842 1727204495.80679: done checking to see if all hosts have failed 44842 1727204495.80680: getting the remaining hosts for this loop 44842 1727204495.80682: done getting the remaining hosts for this loop 44842 1727204495.80687: getting the next task for host managed-node1 44842 1727204495.80696: done getting next task for host managed-node1 44842 1727204495.80699: ^ task is: TASK: Show current_interfaces 44842 1727204495.80704: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204495.80709: getting variables 44842 1727204495.80710: in VariableManager get_vars() 44842 1727204495.80748: Calling all_inventory to load vars for managed-node1 44842 1727204495.80750: Calling groups_inventory to load vars for managed-node1 44842 1727204495.80753: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.80768: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.80771: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.80774: Calling groups_plugins_play to load vars for managed-node1 44842 1727204495.81020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.81254: done with get_vars() 44842 1727204495.81271: done getting variables 44842 1727204495.81451: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.033) 0:00:05.983 ***** 44842 1727204495.81491: entering _queue_task() for managed-node1/debug 44842 1727204495.81980: worker is 1 (out of 1 available) 44842 1727204495.81992: exiting _queue_task() for managed-node1/debug 44842 1727204495.82004: done queuing things up, now waiting for results queue to drain 44842 1727204495.82006: waiting for 
pending results... 44842 1727204495.82902: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 44842 1727204495.83048: in run() - task 0affcd87-79f5-aad0-d242-000000000276 44842 1727204495.83106: variable 'ansible_search_path' from source: unknown 44842 1727204495.83111: variable 'ansible_search_path' from source: unknown 44842 1727204495.83179: calling self._execute() 44842 1727204495.83252: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.83256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.83296: variable 'omit' from source: magic vars 44842 1727204495.83670: variable 'ansible_distribution_major_version' from source: facts 44842 1727204495.83682: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204495.83689: variable 'omit' from source: magic vars 44842 1727204495.83746: variable 'omit' from source: magic vars 44842 1727204495.83855: variable 'current_interfaces' from source: set_fact 44842 1727204495.83885: variable 'omit' from source: magic vars 44842 1727204495.83928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204495.83972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204495.83993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204495.84010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.84022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.84064: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204495.84070: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.84074: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.84178: Set connection var ansible_shell_type to sh 44842 1727204495.84189: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204495.84194: Set connection var ansible_connection to ssh 44842 1727204495.84199: Set connection var ansible_pipelining to False 44842 1727204495.84205: Set connection var ansible_timeout to 10 44842 1727204495.84212: Set connection var ansible_shell_executable to /bin/sh 44842 1727204495.84234: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.84237: variable 'ansible_connection' from source: unknown 44842 1727204495.84240: variable 'ansible_module_compression' from source: unknown 44842 1727204495.84242: variable 'ansible_shell_type' from source: unknown 44842 1727204495.84244: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.84252: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.84254: variable 'ansible_pipelining' from source: unknown 44842 1727204495.84258: variable 'ansible_timeout' from source: unknown 44842 1727204495.84275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.84415: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204495.84424: variable 'omit' from source: magic vars 44842 1727204495.84430: starting attempt loop 44842 1727204495.84433: running the handler 44842 1727204495.84492: handler run complete 44842 1727204495.84506: attempt loop complete, returning result 44842 1727204495.84509: _execute() done 44842 1727204495.84512: dumping result to json 44842 1727204495.84514: done dumping result, returning 44842 
1727204495.84521: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-aad0-d242-000000000276] 44842 1727204495.84527: sending task result for task 0affcd87-79f5-aad0-d242-000000000276 ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 44842 1727204495.84672: no more pending results, returning what we have 44842 1727204495.84678: results queue empty 44842 1727204495.84680: checking for any_errors_fatal 44842 1727204495.84687: done checking for any_errors_fatal 44842 1727204495.84689: checking for max_fail_percentage 44842 1727204495.84690: done checking for max_fail_percentage 44842 1727204495.84691: checking to see if all hosts have failed and the running result is not ok 44842 1727204495.84692: done checking to see if all hosts have failed 44842 1727204495.84693: getting the remaining hosts for this loop 44842 1727204495.84696: done getting the remaining hosts for this loop 44842 1727204495.84700: getting the next task for host managed-node1 44842 1727204495.84709: done getting next task for host managed-node1 44842 1727204495.84712: ^ task is: TASK: Install iproute 44842 1727204495.84716: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204495.84721: getting variables 44842 1727204495.84722: in VariableManager get_vars() 44842 1727204495.84765: Calling all_inventory to load vars for managed-node1 44842 1727204495.84768: Calling groups_inventory to load vars for managed-node1 44842 1727204495.84770: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204495.84782: Calling all_plugins_play to load vars for managed-node1 44842 1727204495.84785: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204495.84788: Calling groups_plugins_play to load vars for managed-node1 44842 1727204495.85054: done sending task result for task 0affcd87-79f5-aad0-d242-000000000276 44842 1727204495.85067: WORKER PROCESS EXITING 44842 1727204495.85097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204495.85684: done with get_vars() 44842 1727204495.85696: done getting variables 44842 1727204495.85853: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:01:35 -0400 (0:00:00.043) 0:00:06.026 ***** 44842 1727204495.85886: entering _queue_task() for managed-node1/package 44842 1727204495.86148: worker is 1 (out of 1 available) 44842 1727204495.86170: exiting _queue_task() for managed-node1/package 44842 1727204495.86183: done queuing things up, now waiting for results queue to drain 44842 1727204495.86185: waiting for pending results... 
44842 1727204495.86456: running TaskExecutor() for managed-node1/TASK: Install iproute 44842 1727204495.86560: in run() - task 0affcd87-79f5-aad0-d242-0000000001cf 44842 1727204495.86580: variable 'ansible_search_path' from source: unknown 44842 1727204495.86584: variable 'ansible_search_path' from source: unknown 44842 1727204495.86630: calling self._execute() 44842 1727204495.86716: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.86725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.86738: variable 'omit' from source: magic vars 44842 1727204495.87129: variable 'ansible_distribution_major_version' from source: facts 44842 1727204495.87144: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204495.87155: variable 'omit' from source: magic vars 44842 1727204495.87197: variable 'omit' from source: magic vars 44842 1727204495.87412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204495.90093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204495.90181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204495.90220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204495.90260: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204495.90293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204495.90451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204495.90455: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204495.91169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204495.91173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204495.91175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204495.91177: variable '__network_is_ostree' from source: set_fact 44842 1727204495.91180: variable 'omit' from source: magic vars 44842 1727204495.91181: variable 'omit' from source: magic vars 44842 1727204495.91183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204495.91186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204495.91188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204495.91190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.91192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204495.91194: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204495.91196: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.91198: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204495.91200: Set connection var ansible_shell_type to sh 44842 1727204495.91202: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204495.91204: Set connection var ansible_connection to ssh 44842 1727204495.91206: Set connection var ansible_pipelining to False 44842 1727204495.91208: Set connection var ansible_timeout to 10 44842 1727204495.91210: Set connection var ansible_shell_executable to /bin/sh 44842 1727204495.91212: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.91214: variable 'ansible_connection' from source: unknown 44842 1727204495.91215: variable 'ansible_module_compression' from source: unknown 44842 1727204495.91217: variable 'ansible_shell_type' from source: unknown 44842 1727204495.91219: variable 'ansible_shell_executable' from source: unknown 44842 1727204495.91221: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204495.91223: variable 'ansible_pipelining' from source: unknown 44842 1727204495.91225: variable 'ansible_timeout' from source: unknown 44842 1727204495.91227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204495.91230: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204495.91232: variable 'omit' from source: magic vars 44842 1727204495.91234: starting attempt loop 44842 1727204495.91236: running the handler 44842 1727204495.91238: variable 'ansible_facts' from source: unknown 44842 1727204495.91241: variable 'ansible_facts' from source: unknown 44842 1727204495.91244: _low_level_execute_command(): starting 44842 1727204495.91246: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 
1727204495.92100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204495.92111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.92122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.92135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.92183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.92193: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204495.92203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.92216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204495.92224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204495.92230: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204495.92237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.92246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.92259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.92272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.92280: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204495.92294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.92375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204495.92393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.92410: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.92492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204495.94035: stdout chunk (state=3): >>>/root <<< 44842 1727204495.94224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.94227: stdout chunk (state=3): >>><<< 44842 1727204495.94238: stderr chunk (state=3): >>><<< 44842 1727204495.94259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204495.94276: _low_level_execute_command(): starting 44842 1727204495.94282: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390 `" && echo 
ansible-tmp-1727204495.942626-45223-249900297522390="` echo /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390 `" ) && sleep 0' 44842 1727204495.96023: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204495.96103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.96114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.96128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.96171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.96179: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204495.96189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.96210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204495.96279: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204495.96286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204495.96294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204495.96304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204495.96319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204495.96327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204495.96334: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204495.96343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204495.96420: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 44842 1727204495.96550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204495.96568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204495.96658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204495.98492: stdout chunk (state=3): >>>ansible-tmp-1727204495.942626-45223-249900297522390=/root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390 <<< 44842 1727204495.98681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204495.98685: stdout chunk (state=3): >>><<< 44842 1727204495.98692: stderr chunk (state=3): >>><<< 44842 1727204495.98710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204495.942626-45223-249900297522390=/root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204495.98748: variable 'ansible_module_compression' from source: unknown 44842 1727204495.98820: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 44842 1727204495.98824: ANSIBALLZ: Acquiring lock 44842 1727204495.98826: ANSIBALLZ: Lock acquired: 140164881036544 44842 1727204495.98828: ANSIBALLZ: Creating module 44842 1727204496.32366: ANSIBALLZ: Writing module into payload 44842 1727204496.32645: ANSIBALLZ: Writing module 44842 1727204496.32681: ANSIBALLZ: Renaming module 44842 1727204496.32691: ANSIBALLZ: Done creating module 44842 1727204496.32711: variable 'ansible_facts' from source: unknown 44842 1727204496.32803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/AnsiballZ_dnf.py 44842 1727204496.33138: Sending initial data 44842 1727204496.33142: Sent initial data (151 bytes) 44842 1727204496.34497: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204496.34517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204496.34537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204496.34558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204496.34613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204496.34628: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204496.34647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204496.34668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204496.34688: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204496.34702: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 44842 1727204496.34721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204496.34736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204496.34757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204496.34774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204496.34789: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204496.34808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204496.34894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204496.34924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204496.34943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204496.35053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204496.36883: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204496.36921: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204496.36975: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpouc61e_s /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/AnsiballZ_dnf.py <<< 44842 1727204496.37035: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204496.38769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204496.39028: stderr chunk (state=3): >>><<< 44842 1727204496.39032: stdout chunk (state=3): >>><<< 44842 1727204496.39034: done transferring module to remote 44842 1727204496.39036: _low_level_execute_command(): starting 44842 1727204496.39038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/ /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/AnsiballZ_dnf.py && sleep 0' 44842 1727204496.39660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204496.39677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204496.39692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204496.39710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204496.39752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204496.39771: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204496.39818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204496.39838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204496.39850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204496.39860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204496.39874: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204496.39887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204496.39901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204496.39912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204496.39922: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204496.39935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204496.40012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204496.40033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204496.40050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204496.40132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204496.41946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204496.41950: stdout chunk (state=3): >>><<< 44842 1727204496.41955: stderr chunk (state=3): >>><<< 44842 1727204496.41984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204496.41988: _low_level_execute_command(): starting 44842 1727204496.41991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/AnsiballZ_dnf.py && sleep 0' 44842 1727204496.42615: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204496.42630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204496.42639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204496.42651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204496.42691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204496.42697: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204496.42706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204496.42718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204496.42726: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204496.42738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204496.42745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204496.42754: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204496.42767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204496.42775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204496.42783: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204496.42791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204496.42863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204496.42887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204496.42898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204496.42991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.33413: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 44842 1727204497.37645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204497.37649: stdout chunk (state=3): >>><<< 44842 1727204497.37656: stderr chunk (state=3): >>><<< 44842 1727204497.37683: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204497.37730: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204497.37736: _low_level_execute_command(): starting 44842 1727204497.37741: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204495.942626-45223-249900297522390/ > /dev/null 2>&1 && sleep 0' 44842 1727204497.39309: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.39456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.39479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.39498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.39545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.39563: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.39582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.39601: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.39613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.39625: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.39639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.39668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.39782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.39796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.39808: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.39822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.39905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.39929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.39946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.40066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.41876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.41955: stderr chunk (state=3): >>><<< 44842 1727204497.41958: stdout chunk (state=3): >>><<< 44842 1727204497.42370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.42377: handler run complete 44842 1727204497.42379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204497.42381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204497.42404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204497.42441: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204497.42480: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204497.42556: variable '__install_status' from source: unknown 44842 1727204497.42588: Evaluated conditional (__install_status is success): True 44842 1727204497.42609: attempt loop complete, returning result 44842 1727204497.42675: _execute() done 44842 1727204497.42683: dumping result to json 44842 1727204497.42692: done dumping result, returning 44842 1727204497.42704: done running TaskExecutor() for managed-node1/TASK: Install iproute [0affcd87-79f5-aad0-d242-0000000001cf] 44842 1727204497.42713: 
sending task result for task 0affcd87-79f5-aad0-d242-0000000001cf ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 44842 1727204497.42917: no more pending results, returning what we have 44842 1727204497.42921: results queue empty 44842 1727204497.42922: checking for any_errors_fatal 44842 1727204497.42929: done checking for any_errors_fatal 44842 1727204497.42930: checking for max_fail_percentage 44842 1727204497.42934: done checking for max_fail_percentage 44842 1727204497.42935: checking to see if all hosts have failed and the running result is not ok 44842 1727204497.42935: done checking to see if all hosts have failed 44842 1727204497.42936: getting the remaining hosts for this loop 44842 1727204497.42938: done getting the remaining hosts for this loop 44842 1727204497.42941: getting the next task for host managed-node1 44842 1727204497.42948: done getting next task for host managed-node1 44842 1727204497.42950: ^ task is: TASK: Create veth interface {{ interface }} 44842 1727204497.42953: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204497.42957: getting variables 44842 1727204497.42958: in VariableManager get_vars() 44842 1727204497.42998: Calling all_inventory to load vars for managed-node1 44842 1727204497.43002: Calling groups_inventory to load vars for managed-node1 44842 1727204497.43005: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204497.43017: Calling all_plugins_play to load vars for managed-node1 44842 1727204497.43020: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204497.43025: Calling groups_plugins_play to load vars for managed-node1 44842 1727204497.43196: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001cf 44842 1727204497.43200: WORKER PROCESS EXITING 44842 1727204497.43213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204497.43404: done with get_vars() 44842 1727204497.43420: done getting variables 44842 1727204497.43476: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204497.43822: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:01:37 -0400 (0:00:01.579) 0:00:07.606 ***** 44842 1727204497.43974: entering _queue_task() for managed-node1/command 44842 1727204497.44292: worker is 1 (out of 1 available) 44842 1727204497.44307: exiting _queue_task() for managed-node1/command 44842 1727204497.44321: done queuing things up, now waiting for results queue to drain 44842 1727204497.44322: waiting for pending results... 
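The banner above points at `manage_test_interface.yml:27`, and the surrounding log shows the `items` lookup plugin loading, an `item` variable being resolved, and the conditional `type == 'veth' and state == 'present' and interface not in current_interfaces` evaluating to True. A task consistent with those traces would look roughly like the sketch below; this is a reconstruction from the log, not the verbatim role file:

```yaml
# Hypothetical reconstruction of the task at manage_test_interface.yml:27,
# inferred from the lookup/items load, the 'item' value, and the evaluated
# conditional visible in this log. The real file may differ in detail.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

With `interface` set to `ethtest0` (as the log's `set_fact` entries show), the first loop item renders to exactly the command the module result reports: `ip link add ethtest0 type veth peer name peerethtest0`.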
44842 1727204497.44597: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 44842 1727204497.44690: in run() - task 0affcd87-79f5-aad0-d242-0000000001d0 44842 1727204497.44701: variable 'ansible_search_path' from source: unknown 44842 1727204497.44705: variable 'ansible_search_path' from source: unknown 44842 1727204497.44980: variable 'interface' from source: set_fact 44842 1727204497.45069: variable 'interface' from source: set_fact 44842 1727204497.45148: variable 'interface' from source: set_fact 44842 1727204497.45318: Loaded config def from plugin (lookup/items) 44842 1727204497.45325: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 44842 1727204497.45349: variable 'omit' from source: magic vars 44842 1727204497.45473: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204497.45481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204497.45492: variable 'omit' from source: magic vars 44842 1727204497.45810: variable 'ansible_distribution_major_version' from source: facts 44842 1727204497.45818: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204497.46623: variable 'type' from source: set_fact 44842 1727204497.46627: variable 'state' from source: include params 44842 1727204497.46630: variable 'interface' from source: set_fact 44842 1727204497.46634: variable 'current_interfaces' from source: set_fact 44842 1727204497.46642: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 44842 1727204497.46649: variable 'omit' from source: magic vars 44842 1727204497.46698: variable 'omit' from source: magic vars 44842 1727204497.46742: variable 'item' from source: unknown 44842 1727204497.46820: variable 'item' from source: unknown 44842 1727204497.46837: variable 'omit' from source: magic vars 44842 1727204497.46873: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204497.46903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204497.46932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204497.46949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204497.46962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204497.46996: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204497.46999: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204497.47002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204497.47107: Set connection var ansible_shell_type to sh 44842 1727204497.47118: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204497.47131: Set connection var ansible_connection to ssh 44842 1727204497.47136: Set connection var ansible_pipelining to False 44842 1727204497.47142: Set connection var ansible_timeout to 10 44842 1727204497.47150: Set connection var ansible_shell_executable to /bin/sh 44842 1727204497.47208: variable 'ansible_shell_executable' from source: unknown 44842 1727204497.47211: variable 'ansible_connection' from source: unknown 44842 1727204497.47213: variable 'ansible_module_compression' from source: unknown 44842 1727204497.47217: variable 'ansible_shell_type' from source: unknown 44842 1727204497.47220: variable 'ansible_shell_executable' from source: unknown 44842 1727204497.47224: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204497.47226: variable 'ansible_pipelining' from source: unknown 44842 1727204497.47239: variable 'ansible_timeout' from 
source: unknown 44842 1727204497.47242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204497.47388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204497.47398: variable 'omit' from source: magic vars 44842 1727204497.47403: starting attempt loop 44842 1727204497.47406: running the handler 44842 1727204497.47421: _low_level_execute_command(): starting 44842 1727204497.47428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204497.48157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.48171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.48183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.48200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.48247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.48253: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.48267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.48280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.48289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.48295: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.48303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.48312: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.48324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.48336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.48344: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.48358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.48426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.48450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.48469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.48547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.50087: stdout chunk (state=3): >>>/root <<< 44842 1727204497.50293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.50296: stdout chunk (state=3): >>><<< 44842 1727204497.50298: stderr chunk (state=3): >>><<< 44842 1727204497.50416: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.50426: _low_level_execute_command(): starting 44842 1727204497.50430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499 `" && echo ansible-tmp-1727204497.5031874-45443-277953165623499="` echo /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499 `" ) && sleep 0' 44842 1727204497.51023: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.51038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.51052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.51082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.51122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.51133: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.51146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.51167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.51184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.51200: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 
1727204497.51215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.51249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.51271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.51299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.51314: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.51341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.51478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.51499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.51520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.52280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.53540: stdout chunk (state=3): >>>ansible-tmp-1727204497.5031874-45443-277953165623499=/root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499 <<< 44842 1727204497.53663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.53738: stderr chunk (state=3): >>><<< 44842 1727204497.53742: stdout chunk (state=3): >>><<< 44842 1727204497.54023: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204497.5031874-45443-277953165623499=/root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.54026: variable 'ansible_module_compression' from source: unknown 44842 1727204497.54029: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204497.54031: variable 'ansible_facts' from source: unknown 44842 1727204497.54033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/AnsiballZ_command.py 44842 1727204497.54407: Sending initial data 44842 1727204497.54410: Sent initial data (156 bytes) 44842 1727204497.55918: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.55931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.55944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.55963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.56120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.56131: stderr chunk (state=3): 
>>>debug2: match not found <<< 44842 1727204497.56143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.56220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.56235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.56246: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.56259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.56280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.56298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.56311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.56329: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.56343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.56424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.56526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.56547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.56666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.58352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204497.58405: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204497.58462: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp_nhtbm03 /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/AnsiballZ_command.py <<< 44842 1727204497.58519: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204497.59971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.60075: stderr chunk (state=3): >>><<< 44842 1727204497.60078: stdout chunk (state=3): >>><<< 44842 1727204497.60080: done transferring module to remote 44842 1727204497.60082: _low_level_execute_command(): starting 44842 1727204497.60156: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/ /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/AnsiballZ_command.py && sleep 0' 44842 1727204497.60745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.60754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.60778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.60795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.60832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.60841: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.60851: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.60868: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.60876: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.60883: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.60891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.60900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.60910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.60917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.60924: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.60933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.61009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.61027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.61039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.61120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.62916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.62922: stdout chunk (state=3): >>><<< 44842 1727204497.62925: stderr chunk (state=3): >>><<< 44842 1727204497.62927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.62932: _low_level_execute_command(): starting 44842 1727204497.62940: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/AnsiballZ_command.py && sleep 0' 44842 1727204497.63557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.63573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.63876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.63879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.63882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.63884: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.63886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 44842 1727204497.63890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.63892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.63894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.63896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.63898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.63900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.63902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.63904: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.63906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.63908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.63910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.63912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.63913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.78031: stdout chunk (state=3): >>> <<< 44842 1727204497.78221: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:01:37.767306", "end": "2024-09-24 15:01:37.779207", "delta": "0:00:00.011901", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, 
"chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204497.80265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204497.80270: stdout chunk (state=3): >>><<< 44842 1727204497.80273: stderr chunk (state=3): >>><<< 44842 1727204497.80286: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:01:37.767306", "end": "2024-09-24 15:01:37.779207", "delta": "0:00:00.011901", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204497.80328: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204497.80336: _low_level_execute_command(): starting 44842 1727204497.80341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204497.5031874-45443-277953165623499/ > /dev/null 2>&1 && sleep 0' 44842 1727204497.81080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.81090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.81100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.81114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.81159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.81168: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.81178: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.81196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.81203: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.81209: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.81217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.81227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.81239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.81249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.81256: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.81278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.81353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.81373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.81389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.81482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.84419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.84479: stderr chunk (state=3): >>><<< 44842 1727204497.84482: stdout chunk (state=3): >>><<< 44842 1727204497.84500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.84506: handler run complete 44842 1727204497.84526: Evaluated conditional (False): False 44842 1727204497.84566: attempt loop complete, returning result 44842 1727204497.84570: variable 'item' from source: unknown 44842 1727204497.84650: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.011901", "end": "2024-09-24 15:01:37.779207", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 15:01:37.767306" } 44842 1727204497.84828: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204497.84831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204497.84834: variable 'omit' from source: magic vars 44842 1727204497.84955: variable 'ansible_distribution_major_version' from source: facts 44842 1727204497.84958: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 44842 1727204497.85138: variable 'type' from source: set_fact 44842 1727204497.85141: variable 'state' from source: include params 44842 1727204497.85146: variable 'interface' from source: set_fact 44842 1727204497.85149: variable 'current_interfaces' from source: set_fact 44842 1727204497.85157: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 44842 1727204497.85162: variable 'omit' from source: magic vars 44842 1727204497.85184: variable 'omit' from source: magic vars 44842 1727204497.85226: variable 'item' from source: unknown 44842 1727204497.85292: variable 'item' from source: unknown 44842 1727204497.85305: variable 'omit' from source: magic vars 44842 1727204497.85327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204497.85335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204497.85341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204497.85356: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204497.85359: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204497.85361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204497.85438: Set connection var ansible_shell_type to sh 44842 1727204497.85447: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204497.85452: Set connection var ansible_connection to ssh 44842 1727204497.85457: Set connection var ansible_pipelining to False 44842 1727204497.85466: Set connection var ansible_timeout to 10 44842 1727204497.85474: Set connection var ansible_shell_executable to /bin/sh 44842 
1727204497.85494: variable 'ansible_shell_executable' from source: unknown 44842 1727204497.85497: variable 'ansible_connection' from source: unknown 44842 1727204497.85500: variable 'ansible_module_compression' from source: unknown 44842 1727204497.85508: variable 'ansible_shell_type' from source: unknown 44842 1727204497.85510: variable 'ansible_shell_executable' from source: unknown 44842 1727204497.85512: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204497.85515: variable 'ansible_pipelining' from source: unknown 44842 1727204497.85517: variable 'ansible_timeout' from source: unknown 44842 1727204497.85518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204497.85613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204497.85639: variable 'omit' from source: magic vars 44842 1727204497.85644: starting attempt loop 44842 1727204497.85647: running the handler 44842 1727204497.85654: _low_level_execute_command(): starting 44842 1727204497.85659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204497.86275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.86281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.86287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.86333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
44842 1727204497.86336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.86338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.86398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.86401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.86459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.87999: stdout chunk (state=3): >>>/root <<< 44842 1727204497.88117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.88155: stderr chunk (state=3): >>><<< 44842 1727204497.88159: stdout chunk (state=3): >>><<< 44842 1727204497.88174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.88213: _low_level_execute_command(): starting 44842 1727204497.88216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658 `" && echo ansible-tmp-1727204497.8817456-45443-265016127709658="` echo /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658 `" ) && sleep 0' 44842 1727204497.88827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.88845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.88868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.88892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.88931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.88943: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.88955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.88979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.88992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.89007: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
44842 1727204497.89018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.89030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.89043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.89055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.89077: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.89091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.89174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.89197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.89219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.89309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.91142: stdout chunk (state=3): >>>ansible-tmp-1727204497.8817456-45443-265016127709658=/root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658 <<< 44842 1727204497.91269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.91320: stderr chunk (state=3): >>><<< 44842 1727204497.91322: stdout chunk (state=3): >>><<< 44842 1727204497.91361: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204497.8817456-45443-265016127709658=/root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.91370: variable 'ansible_module_compression' from source: unknown 44842 1727204497.91393: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204497.91428: variable 'ansible_facts' from source: unknown 44842 1727204497.91516: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/AnsiballZ_command.py 44842 1727204497.92145: Sending initial data 44842 1727204497.92149: Sent initial data (156 bytes) 44842 1727204497.92601: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.92607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.92615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.92624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.92682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204497.92823: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.92827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.92894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.92945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.94643: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204497.94695: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204497.94756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp6kv3spf4 /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/AnsiballZ_command.py <<< 44842 
1727204497.94800: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204497.96080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.96344: stderr chunk (state=3): >>><<< 44842 1727204497.96348: stdout chunk (state=3): >>><<< 44842 1727204497.96351: done transferring module to remote 44842 1727204497.96353: _low_level_execute_command(): starting 44842 1727204497.96355: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/ /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/AnsiballZ_command.py && sleep 0' 44842 1727204497.97590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204497.97594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.97609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.97616: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204497.97626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.97640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204497.97647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204497.97654: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204497.97661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204497.97677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204497.97688: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204497.97704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204497.97711: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204497.97720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204497.97798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204497.97817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204497.97830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204497.97917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204497.99711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204497.99715: stdout chunk (state=3): >>><<< 44842 1727204497.99722: stderr chunk (state=3): >>><<< 44842 1727204497.99746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204497.99749: _low_level_execute_command(): starting 44842 1727204497.99754: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/AnsiballZ_command.py && sleep 0' 44842 1727204498.00805: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.00808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.00816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.00830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.00873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.00881: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.00891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.00904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.00911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.00970: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.00974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.00976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.00978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.00980: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.00982: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.00984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.01050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.01071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.01082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.01927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.14713: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:01:38.143131", "end": "2024-09-24 15:01:38.146406", "delta": "0:00:00.003275", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204498.15859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204498.15863: stdout chunk (state=3): >>><<< 44842 1727204498.15873: stderr chunk (state=3): >>><<< 44842 1727204498.15893: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:01:38.143131", "end": "2024-09-24 15:01:38.146406", "delta": "0:00:00.003275", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204498.15921: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204498.15927: _low_level_execute_command(): starting 44842 1727204498.15932: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204497.8817456-45443-265016127709658/ > /dev/null 2>&1 && sleep 0' 44842 1727204498.17257: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.18182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.18192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.18206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.18247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.18253: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.18267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.18279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.18288: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.18294: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.18302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.18310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.18321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.18329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.18335: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.18344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.18421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.18440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.18451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.18539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.20375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.20379: stdout chunk (state=3): >>><<< 44842 1727204498.20386: stderr chunk (state=3): >>><<< 44842 1727204498.20419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204498.20426: handler run complete 44842 1727204498.20446: Evaluated conditional (False): False 44842 1727204498.20455: attempt loop complete, returning result 44842 1727204498.20481: variable 'item' from source: unknown 44842 1727204498.20561: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003275", "end": "2024-09-24 15:01:38.146406", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 15:01:38.143131" } 44842 1727204498.20691: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204498.20694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204498.20701: variable 'omit' from source: magic vars 44842 1727204498.20873: variable 'ansible_distribution_major_version' from source: facts 44842 1727204498.20879: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204498.21216: variable 'type' from source: set_fact 44842 1727204498.21219: variable 'state' from source: include params 44842 1727204498.21224: variable 'interface' from source: set_fact 44842 1727204498.21226: variable 'current_interfaces' from source: set_fact 44842 
1727204498.21234: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 44842 1727204498.21237: variable 'omit' from source: magic vars 44842 1727204498.21262: variable 'omit' from source: magic vars 44842 1727204498.21306: variable 'item' from source: unknown 44842 1727204498.21551: variable 'item' from source: unknown 44842 1727204498.21573: variable 'omit' from source: magic vars 44842 1727204498.21595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204498.21603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204498.21610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204498.21630: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204498.21633: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204498.21636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204498.21769: Set connection var ansible_shell_type to sh 44842 1727204498.21852: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204498.21855: Set connection var ansible_connection to ssh 44842 1727204498.21862: Set connection var ansible_pipelining to False 44842 1727204498.21873: Set connection var ansible_timeout to 10 44842 1727204498.21880: Set connection var ansible_shell_executable to /bin/sh 44842 1727204498.21900: variable 'ansible_shell_executable' from source: unknown 44842 1727204498.21903: variable 'ansible_connection' from source: unknown 44842 1727204498.21905: variable 'ansible_module_compression' from source: unknown 44842 1727204498.21908: variable 'ansible_shell_type' from source: unknown 44842 1727204498.21910: 
variable 'ansible_shell_executable' from source: unknown 44842 1727204498.21912: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204498.21916: variable 'ansible_pipelining' from source: unknown 44842 1727204498.21919: variable 'ansible_timeout' from source: unknown 44842 1727204498.21923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204498.22349: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204498.22357: variable 'omit' from source: magic vars 44842 1727204498.22360: starting attempt loop 44842 1727204498.22368: running the handler 44842 1727204498.22375: _low_level_execute_command(): starting 44842 1727204498.22378: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204498.23853: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.23861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.23879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.23891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.23927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.23934: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.23944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.23958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.23973: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.23980: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.23990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.23997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.24009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.24017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.24024: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.24033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.24107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.24125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.24138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.24215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.25776: stdout chunk (state=3): >>>/root <<< 44842 1727204498.25941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.25945: stdout chunk (state=3): >>><<< 44842 1727204498.25951: stderr chunk (state=3): >>><<< 44842 1727204498.25972: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204498.25980: _low_level_execute_command(): starting 44842 1727204498.25986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440 `" && echo ansible-tmp-1727204498.2597196-45443-268450405962440="` echo /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440 `" ) && sleep 0' 44842 1727204498.26936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.26940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.27037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.27041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.27117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.27120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.27137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.27212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.29085: stdout chunk (state=3): >>>ansible-tmp-1727204498.2597196-45443-268450405962440=/root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440 <<< 44842 1727204498.29403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.29408: stdout chunk (state=3): >>><<< 44842 1727204498.29410: stderr chunk (state=3): >>><<< 44842 1727204498.29412: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204498.2597196-45443-268450405962440=/root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204498.29415: variable 'ansible_module_compression' from source: unknown 44842 1727204498.29417: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204498.29418: variable 'ansible_facts' from source: unknown 44842 1727204498.29420: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/AnsiballZ_command.py 44842 1727204498.29914: Sending initial data 44842 1727204498.29918: Sent initial data (156 bytes) 44842 1727204498.30847: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.30853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.30896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.30902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204498.30917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.30922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204498.30929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.31012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.31020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.31035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.31114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.32808: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204498.32860: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204498.32916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpz9r3t5mm /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/AnsiballZ_command.py <<< 44842 1727204498.32969: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204498.35020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.35052: stderr chunk (state=3): >>><<< 44842 
1727204498.35055: stdout chunk (state=3): >>><<< 44842 1727204498.35088: done transferring module to remote 44842 1727204498.35096: _low_level_execute_command(): starting 44842 1727204498.35101: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/ /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/AnsiballZ_command.py && sleep 0' 44842 1727204498.36130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.36140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.36151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.36169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.36208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.36330: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.36340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.36353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.36360: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.36373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.36379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.36390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.36401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.36409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 <<< 44842 1727204498.36415: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.36425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.36503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.36521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.36533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.36732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.38495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.38498: stdout chunk (state=3): >>><<< 44842 1727204498.38505: stderr chunk (state=3): >>><<< 44842 1727204498.38541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 44842 1727204498.38545: _low_level_execute_command(): starting 44842 1727204498.38549: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/AnsiballZ_command.py && sleep 0' 44842 1727204498.39475: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.39534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.39544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.39560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.39605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.39612: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.39622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.39636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.39644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.39651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.39658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.39679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.39689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.39696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.39703: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.39712: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.39793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.39811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.39823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.39915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.53444: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:01:38.528672", "end": "2024-09-24 15:01:38.533554", "delta": "0:00:00.004882", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204498.54671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204498.54679: stdout chunk (state=3): >>><<< 44842 1727204498.54683: stderr chunk (state=3): >>><<< 44842 1727204498.54703: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:01:38.528672", "end": "2024-09-24 15:01:38.533554", "delta": "0:00:00.004882", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
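Editor's note: the module reply above arrives as a single JSON object on stdout, which the action plugin decodes into the task result. A minimal sketch of parsing that payload outside of Ansible (the literal is copied from this run's log, with the `invocation` block omitted for brevity):

```python
import json

# Raw stdout emitted by AnsiballZ_command.py in the run above
# ("invocation" details omitted for brevity).
raw = ('{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
       '"cmd": ["ip", "link", "set", "ethtest0", "up"], '
       '"start": "2024-09-24 15:01:38.528672", '
       '"end": "2024-09-24 15:01:38.533554", '
       '"delta": "0:00:00.004882", "msg": ""}')

result = json.loads(raw)
print(result["rc"])             # 0
print(" ".join(result["cmd"]))  # ip link set ethtest0 up
```

Note that the wire payload reports `"changed": true`, while the `ok:` result printed later shows `changed: false` — the task's `changed_when` conditional (the `Evaluated conditional (False): False` record) overrides the module's own value.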
44842 1727204498.54733: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204498.54739: _low_level_execute_command(): starting 44842 1727204498.54744: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204498.2597196-45443-268450405962440/ > /dev/null 2>&1 && sleep 0' 44842 1727204498.55576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.55585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.55596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.55609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.55650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.55657: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.55677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.55691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.55697: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.55704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.55714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.55723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.55734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.55741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.55747: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.55757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.55833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.55850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.55862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.55949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.57759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.57763: stdout chunk (state=3): >>><<< 44842 1727204498.57780: stderr chunk (state=3): >>><<< 44842 1727204498.57797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204498.57802: handler run complete
44842 1727204498.57824: Evaluated conditional (False): False
44842 1727204498.57835: attempt loop complete, returning result
44842 1727204498.57856: variable 'item' from source: unknown
44842 1727204498.57946: variable 'item' from source: unknown
ok: [managed-node1] => (item=ip link set ethtest0 up) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "set",
        "ethtest0",
        "up"
    ],
    "delta": "0:00:00.004882",
    "end": "2024-09-24 15:01:38.533554",
    "item": "ip link set ethtest0 up",
    "rc": 0,
    "start": "2024-09-24 15:01:38.528672"
}
44842 1727204498.58077: dumping result to json
44842 1727204498.58081: done dumping result, returning
44842 1727204498.58084: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 [0affcd87-79f5-aad0-d242-0000000001d0]
44842 1727204498.58087: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d0
44842 1727204498.58217: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d0
44842 1727204498.58220: WORKER PROCESS EXITING
44842 1727204498.58298: no more pending results, returning what we have
44842 1727204498.58301: results queue empty
44842 1727204498.58302: checking for any_errors_fatal
44842 1727204498.58306: done checking for any_errors_fatal
44842 1727204498.58307:
checking for max_fail_percentage 44842 1727204498.58308: done checking for max_fail_percentage 44842 1727204498.58309: checking to see if all hosts have failed and the running result is not ok 44842 1727204498.58310: done checking to see if all hosts have failed 44842 1727204498.58311: getting the remaining hosts for this loop 44842 1727204498.58313: done getting the remaining hosts for this loop 44842 1727204498.58316: getting the next task for host managed-node1 44842 1727204498.58323: done getting next task for host managed-node1 44842 1727204498.58326: ^ task is: TASK: Set up veth as managed by NetworkManager 44842 1727204498.58331: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
44842 1727204498.58334: getting variables
44842 1727204498.58336: in VariableManager get_vars()
44842 1727204498.58372: Calling all_inventory to load vars for managed-node1
44842 1727204498.58375: Calling groups_inventory to load vars for managed-node1
44842 1727204498.58378: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204498.58389: Calling all_plugins_play to load vars for managed-node1
44842 1727204498.58391: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204498.58394: Calling groups_plugins_play to load vars for managed-node1
44842 1727204498.58621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204498.59237: done with get_vars()
44842 1727204498.59250: done getting variables
44842 1727204498.59315: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set up veth as managed by NetworkManager] ********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35
Tuesday 24 September 2024  15:01:38 -0400 (0:00:01.167)       0:00:08.773 *****
44842 1727204498.60576: entering _queue_task() for managed-node1/command
44842 1727204498.61253: worker is 1 (out of 1 available)
44842 1727204498.61268: exiting _queue_task() for managed-node1/command
44842 1727204498.61281: done queuing things up, now waiting for results queue to drain
44842 1727204498.61282: waiting for pending results...
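Editor's note: the `delta` field the command module reports is simply `end - start` on the managed node's clock. A quick sanity check against the timestamps from this run:

```python
from datetime import datetime

# start/end values reported by the command module in the run above
fmt = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.strptime("2024-09-24 15:01:38.528672", fmt)
end = datetime.strptime("2024-09-24 15:01:38.533554", fmt)

print(end - start)  # 0:00:00.004882, matching the module's "delta" field
```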
44842 1727204498.61942: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 44842 1727204498.62157: in run() - task 0affcd87-79f5-aad0-d242-0000000001d1 44842 1727204498.62290: variable 'ansible_search_path' from source: unknown 44842 1727204498.62299: variable 'ansible_search_path' from source: unknown 44842 1727204498.62342: calling self._execute() 44842 1727204498.62551: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204498.62566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204498.62581: variable 'omit' from source: magic vars 44842 1727204498.63326: variable 'ansible_distribution_major_version' from source: facts 44842 1727204498.63346: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204498.63640: variable 'type' from source: set_fact 44842 1727204498.63706: variable 'state' from source: include params 44842 1727204498.63717: Evaluated conditional (type == 'veth' and state == 'present'): True 44842 1727204498.63729: variable 'omit' from source: magic vars 44842 1727204498.63847: variable 'omit' from source: magic vars 44842 1727204498.64020: variable 'interface' from source: set_fact 44842 1727204498.64155: variable 'omit' from source: magic vars 44842 1727204498.64205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204498.64356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204498.64386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204498.64407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204498.64423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
44842 1727204498.64464: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204498.64481: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204498.64490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204498.64668: Set connection var ansible_shell_type to sh 44842 1727204498.64802: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204498.64813: Set connection var ansible_connection to ssh 44842 1727204498.64822: Set connection var ansible_pipelining to False 44842 1727204498.64832: Set connection var ansible_timeout to 10 44842 1727204498.64845: Set connection var ansible_shell_executable to /bin/sh 44842 1727204498.64877: variable 'ansible_shell_executable' from source: unknown 44842 1727204498.64886: variable 'ansible_connection' from source: unknown 44842 1727204498.64897: variable 'ansible_module_compression' from source: unknown 44842 1727204498.64905: variable 'ansible_shell_type' from source: unknown 44842 1727204498.65012: variable 'ansible_shell_executable' from source: unknown 44842 1727204498.65020: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204498.65029: variable 'ansible_pipelining' from source: unknown 44842 1727204498.65036: variable 'ansible_timeout' from source: unknown 44842 1727204498.65044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204498.65193: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204498.65345: variable 'omit' from source: magic vars 44842 1727204498.65354: starting attempt loop 44842 1727204498.65363: running the handler 44842 1727204498.65383: _low_level_execute_command(): 
starting 44842 1727204498.65394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204498.66621: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.66626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.66665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204498.66670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.66673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.66723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.67389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.67392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.67589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.69008: stdout chunk (state=3): >>>/root <<< 44842 1727204498.69108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.69196: stderr chunk (state=3): >>><<< 44842 1727204498.69201: stdout chunk (state=3): >>><<< 44842 1727204498.69332: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204498.69336: _low_level_execute_command(): starting 44842 1727204498.69340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560 `" && echo ansible-tmp-1727204498.6922674-45709-163665055990560="` echo /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560 `" ) && sleep 0' 44842 1727204498.70681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 44842 1727204498.70990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.70997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.71065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.72905: stdout chunk (state=3): >>>ansible-tmp-1727204498.6922674-45709-163665055990560=/root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560 <<< 44842 1727204498.73017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.73096: stderr chunk (state=3): >>><<< 44842 1727204498.73100: stdout chunk (state=3): >>><<< 44842 1727204498.73439: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204498.6922674-45709-163665055990560=/root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204498.73443: variable 'ansible_module_compression' from source: unknown 44842 1727204498.73446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204498.73448: variable 'ansible_facts' from source: unknown 44842 1727204498.73450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/AnsiballZ_command.py 44842 1727204498.73818: Sending initial data 44842 1727204498.73821: Sent initial data (156 bytes) 44842 1727204498.75866: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.75885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.75901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.75919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.75973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.75987: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.76002: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.76021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.76034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.76046: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.76066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.76082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.76098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.76112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.76124: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.76139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.76919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.76944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.76966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.77054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.78751: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204498.78796: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204498.78851: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpga0icqm5 /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/AnsiballZ_command.py <<< 44842 1727204498.78898: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204498.80321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.80369: stderr chunk (state=3): >>><<< 44842 1727204498.80372: stdout chunk (state=3): >>><<< 44842 1727204498.80374: done transferring module to remote 44842 1727204498.80447: _low_level_execute_command(): starting 44842 1727204498.80451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/ /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/AnsiballZ_command.py && sleep 0' 44842 1727204498.82014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204498.82028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.82045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.82066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.82110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.82123: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204498.82137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.82155: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204498.82170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204498.82185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204498.82198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.82212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204498.82228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.82241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204498.82257: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204498.82275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.82351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.82891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204498.82909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.83005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204498.84807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204498.84811: stdout chunk (state=3): >>><<< 44842 1727204498.84813: stderr chunk (state=3): >>><<< 44842 1727204498.84913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204498.84917: _low_level_execute_command(): starting 44842 1727204498.84919: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/AnsiballZ_command.py && sleep 0' 44842 1727204498.86525: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.86529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204498.86690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204498.86693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.86695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204498.86697: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204498.86699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204498.86768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204498.86772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204498.86996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.01932: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:01:38.998614", "end": "2024-09-24 15:01:39.018260", "delta": "0:00:00.019646", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204499.03087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204499.03194: stderr chunk (state=3): >>><<< 44842 1727204499.03198: stdout chunk (state=3): >>><<< 44842 1727204499.03340: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:01:38.998614", "end": "2024-09-24 15:01:39.018260", "delta": "0:00:00.019646", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204499.03345: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204499.03348: _low_level_execute_command(): starting 44842 1727204499.03350: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204498.6922674-45709-163665055990560/ > /dev/null 2>&1 && sleep 0' 44842 1727204499.04811: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204499.04825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.04839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.04862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.04908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.04970: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204499.04985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.05002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204499.05013: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204499.05022: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204499.05033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.05045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.05058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.05181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.05193: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204499.05206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.05289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.05312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204499.05326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.05415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.07249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.07252: stdout chunk (state=3): >>><<< 44842 1727204499.07256: stderr chunk (state=3): >>><<< 44842 1727204499.07471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.07475: handler run complete 44842 1727204499.07477: Evaluated conditional (False): False 44842 1727204499.07479: attempt loop complete, returning result 44842 1727204499.07480: _execute() done 44842 1727204499.07482: dumping result to json 44842 1727204499.07483: done dumping result, returning 44842 1727204499.07485: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-aad0-d242-0000000001d1] 44842 1727204499.07486: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d1 44842 1727204499.07553: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d1 44842 1727204499.07555: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019646", "end": "2024-09-24 15:01:39.018260", "rc": 0, "start": "2024-09-24 15:01:38.998614" } 44842 1727204499.07629: no more pending results, returning what we have 44842 1727204499.07632: results queue empty 44842 1727204499.07633: checking for any_errors_fatal 44842 1727204499.07645: done checking for any_errors_fatal 44842 1727204499.07646: checking for max_fail_percentage 44842 1727204499.07648: done checking for max_fail_percentage 44842 1727204499.07649: checking to see if all 
hosts have failed and the running result is not ok 44842 1727204499.07649: done checking to see if all hosts have failed 44842 1727204499.07650: getting the remaining hosts for this loop 44842 1727204499.07651: done getting the remaining hosts for this loop 44842 1727204499.07654: getting the next task for host managed-node1 44842 1727204499.07662: done getting next task for host managed-node1 44842 1727204499.07669: ^ task is: TASK: Delete veth interface {{ interface }} 44842 1727204499.07672: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.07676: getting variables 44842 1727204499.07677: in VariableManager get_vars() 44842 1727204499.07711: Calling all_inventory to load vars for managed-node1 44842 1727204499.07714: Calling groups_inventory to load vars for managed-node1 44842 1727204499.07716: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.07726: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.07729: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.07732: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.07927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.08167: done with get_vars() 44842 1727204499.08183: done getting variables 44842 1727204499.08240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204499.08481: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.480) 0:00:09.254 ***** 44842 1727204499.08623: entering _queue_task() for managed-node1/command 44842 1727204499.09194: worker is 1 (out of 1 available) 44842 1727204499.09205: exiting _queue_task() for managed-node1/command 44842 1727204499.09219: done queuing things up, now waiting for results queue to drain 44842 1727204499.09220: waiting for pending results... 
44842 1727204499.09904: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 44842 1727204499.10499: in run() - task 0affcd87-79f5-aad0-d242-0000000001d2 44842 1727204499.10518: variable 'ansible_search_path' from source: unknown 44842 1727204499.10525: variable 'ansible_search_path' from source: unknown 44842 1727204499.10569: calling self._execute() 44842 1727204499.10730: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.10740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.10754: variable 'omit' from source: magic vars 44842 1727204499.11072: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.11784: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.11987: variable 'type' from source: set_fact 44842 1727204499.11997: variable 'state' from source: include params 44842 1727204499.12005: variable 'interface' from source: set_fact 44842 1727204499.12012: variable 'current_interfaces' from source: set_fact 44842 1727204499.12024: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 44842 1727204499.12030: when evaluation is False, skipping this task 44842 1727204499.12037: _execute() done 44842 1727204499.12045: dumping result to json 44842 1727204499.12051: done dumping result, returning 44842 1727204499.12059: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 [0affcd87-79f5-aad0-d242-0000000001d2] 44842 1727204499.12073: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d2 44842 1727204499.12173: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d2 44842 1727204499.12180: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 44842 1727204499.12312: no more pending results, returning what we have 44842 1727204499.12316: results queue empty 44842 1727204499.12317: checking for any_errors_fatal 44842 1727204499.12321: done checking for any_errors_fatal 44842 1727204499.12322: checking for max_fail_percentage 44842 1727204499.12324: done checking for max_fail_percentage 44842 1727204499.12325: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.12325: done checking to see if all hosts have failed 44842 1727204499.12326: getting the remaining hosts for this loop 44842 1727204499.12328: done getting the remaining hosts for this loop 44842 1727204499.12332: getting the next task for host managed-node1 44842 1727204499.12338: done getting next task for host managed-node1 44842 1727204499.12340: ^ task is: TASK: Create dummy interface {{ interface }} 44842 1727204499.12343: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.12347: getting variables 44842 1727204499.12348: in VariableManager get_vars() 44842 1727204499.12380: Calling all_inventory to load vars for managed-node1 44842 1727204499.12382: Calling groups_inventory to load vars for managed-node1 44842 1727204499.12384: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.12394: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.12396: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.12398: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.12575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.12806: done with get_vars() 44842 1727204499.12817: done getting variables 44842 1727204499.12968: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204499.13191: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.047) 0:00:09.301 ***** 44842 1727204499.13334: entering _queue_task() for managed-node1/command 44842 1727204499.13819: worker is 1 (out of 1 available) 44842 1727204499.13833: exiting _queue_task() for managed-node1/command 44842 1727204499.13845: done queuing things up, now waiting for results queue to drain 44842 1727204499.13846: waiting for pending results... 
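The task above was skipped because its three-part `when:` conditional evaluated to False. A minimal shell approximation of that check (the variable values are hypothetical stand-ins — the log shows only the conditional text, and `state='present'` is assumed here because the delete task did not run) could look like:

```shell
# Hypothetical values; only the conditional's shape comes from the log.
type='veth'; state='present'; interface='ethtest0'
current_interfaces='lo eth0 ethtest0'

# Mirrors: type == 'veth' and state == 'absent' and interface in current_interfaces
if [ "$type" = 'veth' ] && [ "$state" = 'absent' ] \
   && printf '%s\n' $current_interfaces | grep -qx "$interface"; then
  echo "would delete $interface"
else
  echo "skipping: conditional result was False"
fi
```

All three clauses must hold for the task to run; here the `state` check fails first, matching the `"false_condition"` field in the skip result above.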
44842 1727204499.15007: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 44842 1727204499.15106: in run() - task 0affcd87-79f5-aad0-d242-0000000001d3 44842 1727204499.15125: variable 'ansible_search_path' from source: unknown 44842 1727204499.15132: variable 'ansible_search_path' from source: unknown 44842 1727204499.15173: calling self._execute() 44842 1727204499.15253: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.15265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.15280: variable 'omit' from source: magic vars 44842 1727204499.15601: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.16283: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.16483: variable 'type' from source: set_fact 44842 1727204499.16492: variable 'state' from source: include params 44842 1727204499.16500: variable 'interface' from source: set_fact 44842 1727204499.16507: variable 'current_interfaces' from source: set_fact 44842 1727204499.16518: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 44842 1727204499.16524: when evaluation is False, skipping this task 44842 1727204499.16531: _execute() done 44842 1727204499.16538: dumping result to json 44842 1727204499.16544: done dumping result, returning 44842 1727204499.16553: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 [0affcd87-79f5-aad0-d242-0000000001d3] 44842 1727204499.16566: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d3 44842 1727204499.16668: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d3 44842 1727204499.16676: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 44842 1727204499.16728: no more pending results, returning what we have 44842 1727204499.16732: results queue empty 44842 1727204499.16733: checking for any_errors_fatal 44842 1727204499.16739: done checking for any_errors_fatal 44842 1727204499.16740: checking for max_fail_percentage 44842 1727204499.16741: done checking for max_fail_percentage 44842 1727204499.16742: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.16743: done checking to see if all hosts have failed 44842 1727204499.16744: getting the remaining hosts for this loop 44842 1727204499.16745: done getting the remaining hosts for this loop 44842 1727204499.16749: getting the next task for host managed-node1 44842 1727204499.16756: done getting next task for host managed-node1 44842 1727204499.16758: ^ task is: TASK: Delete dummy interface {{ interface }} 44842 1727204499.16766: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.16771: getting variables 44842 1727204499.16773: in VariableManager get_vars() 44842 1727204499.16811: Calling all_inventory to load vars for managed-node1 44842 1727204499.16813: Calling groups_inventory to load vars for managed-node1 44842 1727204499.16816: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.16827: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.16829: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.16831: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.17021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.17285: done with get_vars() 44842 1727204499.17296: done getting variables 44842 1727204499.17350: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204499.18183: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.048) 0:00:09.350 ***** 44842 1727204499.18254: entering _queue_task() for managed-node1/command 44842 1727204499.19125: worker is 1 (out of 1 available) 44842 1727204499.19138: exiting _queue_task() for managed-node1/command 44842 1727204499.19150: done queuing things up, now waiting for results queue to drain 44842 1727204499.19152: waiting for pending results... 
44842 1727204499.20094: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 44842 1727204499.20339: in run() - task 0affcd87-79f5-aad0-d242-0000000001d4 44842 1727204499.20381: variable 'ansible_search_path' from source: unknown 44842 1727204499.20417: variable 'ansible_search_path' from source: unknown 44842 1727204499.20523: calling self._execute() 44842 1727204499.20668: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.20793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.20811: variable 'omit' from source: magic vars 44842 1727204499.21466: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.21484: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.21691: variable 'type' from source: set_fact 44842 1727204499.21701: variable 'state' from source: include params 44842 1727204499.21708: variable 'interface' from source: set_fact 44842 1727204499.21714: variable 'current_interfaces' from source: set_fact 44842 1727204499.21724: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 44842 1727204499.21730: when evaluation is False, skipping this task 44842 1727204499.21736: _execute() done 44842 1727204499.21741: dumping result to json 44842 1727204499.21747: done dumping result, returning 44842 1727204499.21755: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 [0affcd87-79f5-aad0-d242-0000000001d4] 44842 1727204499.21771: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d4 44842 1727204499.21873: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d4 44842 1727204499.21880: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 44842 1727204499.21932: no more pending results, returning what we have 44842 1727204499.21936: results queue empty 44842 1727204499.21937: checking for any_errors_fatal 44842 1727204499.21945: done checking for any_errors_fatal 44842 1727204499.21946: checking for max_fail_percentage 44842 1727204499.21947: done checking for max_fail_percentage 44842 1727204499.21948: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.21949: done checking to see if all hosts have failed 44842 1727204499.21950: getting the remaining hosts for this loop 44842 1727204499.21951: done getting the remaining hosts for this loop 44842 1727204499.21955: getting the next task for host managed-node1 44842 1727204499.21965: done getting next task for host managed-node1 44842 1727204499.21968: ^ task is: TASK: Create tap interface {{ interface }} 44842 1727204499.21971: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.21976: getting variables 44842 1727204499.21977: in VariableManager get_vars() 44842 1727204499.22018: Calling all_inventory to load vars for managed-node1 44842 1727204499.22021: Calling groups_inventory to load vars for managed-node1 44842 1727204499.22023: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.22035: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.22037: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.22041: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.22229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.22452: done with get_vars() 44842 1727204499.22468: done getting variables 44842 1727204499.22527: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204499.22755: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.045) 0:00:09.396 ***** 44842 1727204499.22790: entering _queue_task() for managed-node1/command 44842 1727204499.23304: worker is 1 (out of 1 available) 44842 1727204499.23317: exiting _queue_task() for managed-node1/command 44842 1727204499.23343: done queuing things up, now waiting for results queue to drain 44842 1727204499.23345: waiting for pending results... 
44842 1727204499.23622: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 44842 1727204499.23736: in run() - task 0affcd87-79f5-aad0-d242-0000000001d5 44842 1727204499.23756: variable 'ansible_search_path' from source: unknown 44842 1727204499.23774: variable 'ansible_search_path' from source: unknown 44842 1727204499.23818: calling self._execute() 44842 1727204499.23912: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.23922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.23938: variable 'omit' from source: magic vars 44842 1727204499.24314: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.24336: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.24566: variable 'type' from source: set_fact 44842 1727204499.24578: variable 'state' from source: include params 44842 1727204499.24588: variable 'interface' from source: set_fact 44842 1727204499.24597: variable 'current_interfaces' from source: set_fact 44842 1727204499.24609: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 44842 1727204499.24616: when evaluation is False, skipping this task 44842 1727204499.24623: _execute() done 44842 1727204499.24637: dumping result to json 44842 1727204499.24645: done dumping result, returning 44842 1727204499.24659: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 [0affcd87-79f5-aad0-d242-0000000001d5] 44842 1727204499.24674: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d5 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 44842 1727204499.24820: no more pending results, returning what we have 44842 1727204499.24824: results queue empty 44842 
1727204499.24826: checking for any_errors_fatal 44842 1727204499.24871: done checking for any_errors_fatal 44842 1727204499.24872: checking for max_fail_percentage 44842 1727204499.24874: done checking for max_fail_percentage 44842 1727204499.24876: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.24876: done checking to see if all hosts have failed 44842 1727204499.24877: getting the remaining hosts for this loop 44842 1727204499.24880: done getting the remaining hosts for this loop 44842 1727204499.24885: getting the next task for host managed-node1 44842 1727204499.24893: done getting next task for host managed-node1 44842 1727204499.24895: ^ task is: TASK: Delete tap interface {{ interface }} 44842 1727204499.24899: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.24903: getting variables 44842 1727204499.24904: in VariableManager get_vars() 44842 1727204499.25234: Calling all_inventory to load vars for managed-node1 44842 1727204499.25238: Calling groups_inventory to load vars for managed-node1 44842 1727204499.25242: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.25272: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.25275: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.25278: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.25643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.25993: done with get_vars() 44842 1727204499.26003: done getting variables 44842 1727204499.26092: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204499.26214: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.034) 0:00:09.430 ***** 44842 1727204499.26249: entering _queue_task() for managed-node1/command 44842 1727204499.26261: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d5 44842 1727204499.26265: WORKER PROCESS EXITING 44842 1727204499.26468: worker is 1 (out of 1 available) 44842 1727204499.26486: exiting _queue_task() for managed-node1/command 44842 1727204499.26498: done queuing things up, now waiting for results queue to drain 44842 1727204499.26499: waiting for pending results... 
44842 1727204499.26653: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 44842 1727204499.26720: in run() - task 0affcd87-79f5-aad0-d242-0000000001d6 44842 1727204499.26730: variable 'ansible_search_path' from source: unknown 44842 1727204499.26733: variable 'ansible_search_path' from source: unknown 44842 1727204499.26768: calling self._execute() 44842 1727204499.26834: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.26837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.26852: variable 'omit' from source: magic vars 44842 1727204499.27112: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.27124: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.27319: variable 'type' from source: set_fact 44842 1727204499.27332: variable 'state' from source: include params 44842 1727204499.27342: variable 'interface' from source: set_fact 44842 1727204499.27350: variable 'current_interfaces' from source: set_fact 44842 1727204499.27366: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 44842 1727204499.27374: when evaluation is False, skipping this task 44842 1727204499.27384: _execute() done 44842 1727204499.27394: dumping result to json 44842 1727204499.27401: done dumping result, returning 44842 1727204499.27410: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 [0affcd87-79f5-aad0-d242-0000000001d6] 44842 1727204499.27422: sending task result for task 0affcd87-79f5-aad0-d242-0000000001d6 44842 1727204499.27534: done sending task result for task 0affcd87-79f5-aad0-d242-0000000001d6 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 44842 1727204499.27591: no more pending 
results, returning what we have 44842 1727204499.27597: results queue empty 44842 1727204499.27598: checking for any_errors_fatal 44842 1727204499.27605: done checking for any_errors_fatal 44842 1727204499.27606: checking for max_fail_percentage 44842 1727204499.27608: done checking for max_fail_percentage 44842 1727204499.27609: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.27610: done checking to see if all hosts have failed 44842 1727204499.27611: getting the remaining hosts for this loop 44842 1727204499.27613: done getting the remaining hosts for this loop 44842 1727204499.27617: getting the next task for host managed-node1 44842 1727204499.27627: done getting next task for host managed-node1 44842 1727204499.27635: ^ task is: TASK: Include the task 'assert_device_present.yml' 44842 1727204499.27638: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.27644: getting variables 44842 1727204499.27646: in VariableManager get_vars() 44842 1727204499.27690: Calling all_inventory to load vars for managed-node1 44842 1727204499.27693: Calling groups_inventory to load vars for managed-node1 44842 1727204499.27695: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.27707: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.27710: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.27713: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.28020: WORKER PROCESS EXITING 44842 1727204499.28048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.28417: done with get_vars() 44842 1727204499.28427: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:20 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.022) 0:00:09.453 ***** 44842 1727204499.28527: entering _queue_task() for managed-node1/include_tasks 44842 1727204499.28769: worker is 1 (out of 1 available) 44842 1727204499.28781: exiting _queue_task() for managed-node1/include_tasks 44842 1727204499.28793: done queuing things up, now waiting for results queue to drain 44842 1727204499.28794: waiting for pending results... 
44842 1727204499.29068: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 44842 1727204499.29171: in run() - task 0affcd87-79f5-aad0-d242-00000000000e 44842 1727204499.29193: variable 'ansible_search_path' from source: unknown 44842 1727204499.29232: calling self._execute() 44842 1727204499.29326: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.29337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.29356: variable 'omit' from source: magic vars 44842 1727204499.29734: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.29750: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.29759: _execute() done 44842 1727204499.29773: dumping result to json 44842 1727204499.29784: done dumping result, returning 44842 1727204499.29793: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-aad0-d242-00000000000e] 44842 1727204499.29803: sending task result for task 0affcd87-79f5-aad0-d242-00000000000e 44842 1727204499.29930: no more pending results, returning what we have 44842 1727204499.29936: in VariableManager get_vars() 44842 1727204499.29984: Calling all_inventory to load vars for managed-node1 44842 1727204499.29987: Calling groups_inventory to load vars for managed-node1 44842 1727204499.29990: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.30003: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.30007: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.30010: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.30614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.30827: done with get_vars() 44842 1727204499.30834: variable 'ansible_search_path' 
from source: unknown 44842 1727204499.30848: we have included files to process 44842 1727204499.30849: generating all_blocks data 44842 1727204499.30851: done generating all_blocks data 44842 1727204499.30855: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44842 1727204499.30856: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44842 1727204499.30859: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44842 1727204499.30868: done sending task result for task 0affcd87-79f5-aad0-d242-00000000000e 44842 1727204499.30871: WORKER PROCESS EXITING 44842 1727204499.31028: in VariableManager get_vars() 44842 1727204499.31047: done with get_vars() 44842 1727204499.31167: done processing included file 44842 1727204499.31169: iterating over new_blocks loaded from include file 44842 1727204499.31170: in VariableManager get_vars() 44842 1727204499.31190: done with get_vars() 44842 1727204499.31191: filtering new block on tags 44842 1727204499.31209: done filtering new block on tags 44842 1727204499.31211: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 44842 1727204499.31216: extending task lists for all hosts with included blocks 44842 1727204499.35221: done extending task lists 44842 1727204499.35223: done processing included files 44842 1727204499.35224: results queue empty 44842 1727204499.35225: checking for any_errors_fatal 44842 1727204499.35228: done checking for any_errors_fatal 44842 1727204499.35229: checking for max_fail_percentage 44842 1727204499.35230: done checking for max_fail_percentage 44842 1727204499.35231: checking to 
see if all hosts have failed and the running result is not ok 44842 1727204499.35231: done checking to see if all hosts have failed 44842 1727204499.35232: getting the remaining hosts for this loop 44842 1727204499.35233: done getting the remaining hosts for this loop 44842 1727204499.35236: getting the next task for host managed-node1 44842 1727204499.35240: done getting next task for host managed-node1 44842 1727204499.35243: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44842 1727204499.35245: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.35248: getting variables 44842 1727204499.35249: in VariableManager get_vars() 44842 1727204499.35269: Calling all_inventory to load vars for managed-node1 44842 1727204499.35272: Calling groups_inventory to load vars for managed-node1 44842 1727204499.35274: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.35280: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.35283: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.35285: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.35443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.35669: done with get_vars() 44842 1727204499.35683: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.072) 0:00:09.525 ***** 44842 1727204499.35766: entering _queue_task() for managed-node1/include_tasks 44842 1727204499.36726: worker is 1 (out of 1 available) 44842 1727204499.36737: exiting _queue_task() for managed-node1/include_tasks 44842 1727204499.36748: done queuing things up, now waiting for results queue to drain 44842 1727204499.36750: waiting for pending results... 
44842 1727204499.37005: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 44842 1727204499.37113: in run() - task 0affcd87-79f5-aad0-d242-0000000002ec 44842 1727204499.37133: variable 'ansible_search_path' from source: unknown 44842 1727204499.37140: variable 'ansible_search_path' from source: unknown 44842 1727204499.37185: calling self._execute() 44842 1727204499.37279: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.37290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.37308: variable 'omit' from source: magic vars 44842 1727204499.37657: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.37678: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.37689: _execute() done 44842 1727204499.37697: dumping result to json 44842 1727204499.37704: done dumping result, returning 44842 1727204499.37713: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-aad0-d242-0000000002ec] 44842 1727204499.37723: sending task result for task 0affcd87-79f5-aad0-d242-0000000002ec 44842 1727204499.37830: done sending task result for task 0affcd87-79f5-aad0-d242-0000000002ec 44842 1727204499.37837: WORKER PROCESS EXITING 44842 1727204499.37876: no more pending results, returning what we have 44842 1727204499.37881: in VariableManager get_vars() 44842 1727204499.37928: Calling all_inventory to load vars for managed-node1 44842 1727204499.37931: Calling groups_inventory to load vars for managed-node1 44842 1727204499.37934: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.37948: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.37951: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.37954: Calling groups_plugins_play to load vars for managed-node1 44842 
1727204499.38706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.38914: done with get_vars() 44842 1727204499.38921: variable 'ansible_search_path' from source: unknown 44842 1727204499.38922: variable 'ansible_search_path' from source: unknown 44842 1727204499.38965: we have included files to process 44842 1727204499.38966: generating all_blocks data 44842 1727204499.38968: done generating all_blocks data 44842 1727204499.38968: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44842 1727204499.38969: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44842 1727204499.38972: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44842 1727204499.39234: done processing included file 44842 1727204499.39236: iterating over new_blocks loaded from include file 44842 1727204499.39238: in VariableManager get_vars() 44842 1727204499.39254: done with get_vars() 44842 1727204499.39255: filtering new block on tags 44842 1727204499.39276: done filtering new block on tags 44842 1727204499.39278: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 44842 1727204499.39285: extending task lists for all hosts with included blocks 44842 1727204499.39531: done extending task lists 44842 1727204499.39533: done processing included files 44842 1727204499.39534: results queue empty 44842 1727204499.39534: checking for any_errors_fatal 44842 1727204499.39537: done checking for any_errors_fatal 44842 1727204499.39537: checking for max_fail_percentage 44842 1727204499.39539: done checking for 
max_fail_percentage 44842 1727204499.39539: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.39540: done checking to see if all hosts have failed 44842 1727204499.39541: getting the remaining hosts for this loop 44842 1727204499.39542: done getting the remaining hosts for this loop 44842 1727204499.39544: getting the next task for host managed-node1 44842 1727204499.39549: done getting next task for host managed-node1 44842 1727204499.39551: ^ task is: TASK: Get stat for interface {{ interface }} 44842 1727204499.39553: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.39556: getting variables 44842 1727204499.39557: in VariableManager get_vars() 44842 1727204499.39571: Calling all_inventory to load vars for managed-node1 44842 1727204499.39574: Calling groups_inventory to load vars for managed-node1 44842 1727204499.39575: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.39580: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.39582: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.39585: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.39730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.39939: done with get_vars() 44842 1727204499.39948: done getting variables 44842 1727204499.40101: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.043) 0:00:09.569 ***** 44842 1727204499.40129: entering _queue_task() for managed-node1/stat 44842 1727204499.40396: worker is 1 (out of 1 available) 44842 1727204499.40408: exiting _queue_task() for managed-node1/stat 44842 1727204499.40419: done queuing things up, now waiting for results queue to drain 44842 1727204499.40421: waiting for pending results... 
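The task banner above marks the start of the `stat` task included from `get_interface_stat.yml`. Judging from the module arguments echoed later in this log (path `/sys/class/net/ethtest0`, with `get_attributes`, `get_checksum`, and `get_mime` all false) and the `interface_stat` fact the follow-up assert reads, the included task likely looks roughly like this. This is a hedged reconstruction inferred from the log, not the verbatim file:

```yaml
# Sketch of tasks/get_interface_stat.yml (assumption: reconstructed from
# the module_args and registered variable visible in this debug output).
- name: Get stat for interface {{ interface }}
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```

Checking `/sys/class/net/<name>` is a cheap way to test device presence: the kernel exposes each network interface as a symlink there, which matches the `islnk: true` result returned below.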
44842 1727204499.40688: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 44842 1727204499.40815: in run() - task 0affcd87-79f5-aad0-d242-0000000003b5 44842 1727204499.40834: variable 'ansible_search_path' from source: unknown 44842 1727204499.40844: variable 'ansible_search_path' from source: unknown 44842 1727204499.40936: calling self._execute() 44842 1727204499.41031: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.41043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.41055: variable 'omit' from source: magic vars 44842 1727204499.41838: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.41857: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.41874: variable 'omit' from source: magic vars 44842 1727204499.41920: variable 'omit' from source: magic vars 44842 1727204499.42135: variable 'interface' from source: set_fact 44842 1727204499.42162: variable 'omit' from source: magic vars 44842 1727204499.42320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204499.42359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204499.42394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204499.42417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204499.42510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204499.42544: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204499.42553: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.42563: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.42784: Set connection var ansible_shell_type to sh 44842 1727204499.42801: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204499.42830: Set connection var ansible_connection to ssh 44842 1727204499.42897: Set connection var ansible_pipelining to False 44842 1727204499.42907: Set connection var ansible_timeout to 10 44842 1727204499.42918: Set connection var ansible_shell_executable to /bin/sh 44842 1727204499.42947: variable 'ansible_shell_executable' from source: unknown 44842 1727204499.42956: variable 'ansible_connection' from source: unknown 44842 1727204499.42970: variable 'ansible_module_compression' from source: unknown 44842 1727204499.42978: variable 'ansible_shell_type' from source: unknown 44842 1727204499.42986: variable 'ansible_shell_executable' from source: unknown 44842 1727204499.42993: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.43001: variable 'ansible_pipelining' from source: unknown 44842 1727204499.43008: variable 'ansible_timeout' from source: unknown 44842 1727204499.43016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.43217: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204499.43233: variable 'omit' from source: magic vars 44842 1727204499.43244: starting attempt loop 44842 1727204499.43255: running the handler 44842 1727204499.43285: _low_level_execute_command(): starting 44842 1727204499.43298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204499.44984: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.45097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.45138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204499.45142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.45145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.45225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.45316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204499.45319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.45395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.46999: stdout chunk (state=3): >>>/root <<< 44842 1727204499.47102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.47190: stderr chunk (state=3): >>><<< 44842 1727204499.47194: stdout chunk (state=3): >>><<< 44842 1727204499.47276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.47280: _low_level_execute_command(): starting 44842 1727204499.47283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299 `" && echo ansible-tmp-1727204499.4721744-45740-84802164689299="` echo /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299 `" ) && sleep 0' 44842 1727204499.47958: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204499.47978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.47994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.48011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.48062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.48086: stderr chunk 
(state=3): >>>debug2: match not found <<< 44842 1727204499.48099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.48116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204499.48127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204499.48142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204499.48159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.48179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.48195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.48208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.48218: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204499.48230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.48316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.48338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204499.48356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.48451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.50324: stdout chunk (state=3): >>>ansible-tmp-1727204499.4721744-45740-84802164689299=/root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299 <<< 44842 1727204499.50440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.50500: stderr chunk (state=3): >>><<< 44842 1727204499.50502: stdout chunk (state=3): >>><<< 44842 1727204499.50518: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204499.4721744-45740-84802164689299=/root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.50558: variable 'ansible_module_compression' from source: unknown 44842 1727204499.50608: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44842 1727204499.50643: variable 'ansible_facts' from source: unknown 44842 1727204499.50707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/AnsiballZ_stat.py 44842 1727204499.50822: Sending initial data 44842 1727204499.50826: Sent initial data (152 bytes) 44842 1727204499.51779: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204499.51785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.51824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.51847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.51851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.51867: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204499.51873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.51956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.51969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204499.51982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.52071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.53754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204499.53803: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204499.53858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpfr_yaf1r /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/AnsiballZ_stat.py <<< 44842 1727204499.53906: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204499.54748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.54860: stderr chunk (state=3): >>><<< 44842 1727204499.54864: stdout chunk (state=3): >>><<< 44842 1727204499.54885: done transferring module to remote 44842 1727204499.54894: _low_level_execute_command(): starting 44842 1727204499.54899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/ /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/AnsiballZ_stat.py && sleep 0' 44842 1727204499.55346: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.55351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.55400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.55403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.55409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.55459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204499.55471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.55536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.57281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.57287: stderr chunk (state=3): >>><<< 44842 1727204499.57290: stdout chunk (state=3): >>><<< 44842 1727204499.57308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.57312: _low_level_execute_command(): starting 44842 1727204499.57316: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/AnsiballZ_stat.py && sleep 0' 44842 1727204499.57812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.57818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.57869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.57873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.57875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.57924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.57928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.57995: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.71046: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29789, "dev": 21, "nlink": 1, "atime": 1727204497.7714543, "mtime": 1727204497.7714543, "ctime": 1727204497.7714543, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44842 1727204499.71913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204499.71970: stderr chunk (state=3): >>><<< 44842 1727204499.71973: stdout chunk (state=3): >>><<< 44842 1727204499.71993: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29789, "dev": 21, "nlink": 1, "atime": 1727204497.7714543, "mtime": 1727204497.7714543, "ctime": 1727204497.7714543, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204499.72034: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204499.72043: _low_level_execute_command(): starting 44842 1727204499.72046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204499.4721744-45740-84802164689299/ > /dev/null 2>&1 && sleep 0' 44842 1727204499.72508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.72512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.72549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204499.72562: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.72575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.72619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.72641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.72691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.74418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.74474: stderr chunk (state=3): >>><<< 44842 1727204499.74478: stdout chunk (state=3): >>><<< 44842 1727204499.74492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.74497: handler run complete 44842 1727204499.74528: attempt loop complete, returning result 44842 1727204499.74531: _execute() done 44842 1727204499.74533: dumping result to json 44842 1727204499.74539: done dumping result, returning 44842 1727204499.74546: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [0affcd87-79f5-aad0-d242-0000000003b5] 44842 1727204499.74551: sending task result for task 0affcd87-79f5-aad0-d242-0000000003b5 44842 1727204499.74656: done sending task result for task 0affcd87-79f5-aad0-d242-0000000003b5 44842 1727204499.74658: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204497.7714543, "block_size": 4096, "blocks": 0, "ctime": 1727204497.7714543, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29789, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204497.7714543, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 44842 1727204499.74836: no more pending results, returning what we have 44842 1727204499.74840: results queue empty 44842 1727204499.74841: checking for any_errors_fatal 44842 
1727204499.74842: done checking for any_errors_fatal 44842 1727204499.74843: checking for max_fail_percentage 44842 1727204499.74844: done checking for max_fail_percentage 44842 1727204499.74845: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.74846: done checking to see if all hosts have failed 44842 1727204499.74846: getting the remaining hosts for this loop 44842 1727204499.74848: done getting the remaining hosts for this loop 44842 1727204499.74851: getting the next task for host managed-node1 44842 1727204499.74857: done getting next task for host managed-node1 44842 1727204499.74859: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 44842 1727204499.74862: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.74866: getting variables 44842 1727204499.74867: in VariableManager get_vars() 44842 1727204499.74890: Calling all_inventory to load vars for managed-node1 44842 1727204499.74892: Calling groups_inventory to load vars for managed-node1 44842 1727204499.74893: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.74900: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.74902: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.74904: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.75013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.75144: done with get_vars() 44842 1727204499.75152: done getting variables 44842 1727204499.75225: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 44842 1727204499.75315: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.352) 0:00:09.921 ***** 44842 1727204499.75336: entering _queue_task() for managed-node1/assert 44842 1727204499.75338: Creating lock for assert 44842 1727204499.75529: worker is 1 (out of 1 available) 44842 1727204499.75541: exiting _queue_task() for managed-node1/assert 44842 1727204499.75553: done queuing things up, now waiting for results queue to drain 44842 1727204499.75554: waiting for pending results... 
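The assert task queued above, at `assert_device_present.yml:5`, evaluates `interface_stat.stat.exists` against the stat result registered by the previous task. A minimal sketch of what that task likely contains, inferred from the task name and the evaluated conditional in this log (an assumption, not the verbatim file):

```yaml
# Sketch of tasks/assert_device_present.yml (assumption: reconstructed
# from the task name and conditional evaluated in this debug output).
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```

Because `assert` is a pure action plugin, it runs entirely on the controller: note that, unlike the `stat` task, no `_low_level_execute_command()` or module transfer appears between queuing and the handler run.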
44842 1727204499.75721: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' 44842 1727204499.75784: in run() - task 0affcd87-79f5-aad0-d242-0000000002ed 44842 1727204499.75795: variable 'ansible_search_path' from source: unknown 44842 1727204499.75798: variable 'ansible_search_path' from source: unknown 44842 1727204499.75825: calling self._execute() 44842 1727204499.75893: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.75897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.75905: variable 'omit' from source: magic vars 44842 1727204499.76161: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.76176: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.76181: variable 'omit' from source: magic vars 44842 1727204499.76212: variable 'omit' from source: magic vars 44842 1727204499.76280: variable 'interface' from source: set_fact 44842 1727204499.76293: variable 'omit' from source: magic vars 44842 1727204499.76329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204499.76355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204499.76377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204499.76389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204499.76402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204499.76424: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204499.76427: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.76431: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.76499: Set connection var ansible_shell_type to sh 44842 1727204499.76507: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204499.76516: Set connection var ansible_connection to ssh 44842 1727204499.76521: Set connection var ansible_pipelining to False 44842 1727204499.76526: Set connection var ansible_timeout to 10 44842 1727204499.76533: Set connection var ansible_shell_executable to /bin/sh 44842 1727204499.76552: variable 'ansible_shell_executable' from source: unknown 44842 1727204499.76555: variable 'ansible_connection' from source: unknown 44842 1727204499.76558: variable 'ansible_module_compression' from source: unknown 44842 1727204499.76562: variable 'ansible_shell_type' from source: unknown 44842 1727204499.76567: variable 'ansible_shell_executable' from source: unknown 44842 1727204499.76570: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.76572: variable 'ansible_pipelining' from source: unknown 44842 1727204499.76574: variable 'ansible_timeout' from source: unknown 44842 1727204499.76576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.76680: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204499.76688: variable 'omit' from source: magic vars 44842 1727204499.76693: starting attempt loop 44842 1727204499.76696: running the handler 44842 1727204499.76789: variable 'interface_stat' from source: set_fact 44842 1727204499.76803: Evaluated conditional (interface_stat.stat.exists): True 44842 1727204499.76809: handler run complete 44842 1727204499.76819: attempt loop complete, returning result 44842 
1727204499.76822: _execute() done 44842 1727204499.76825: dumping result to json 44842 1727204499.76827: done dumping result, returning 44842 1727204499.76834: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' [0affcd87-79f5-aad0-d242-0000000002ed] 44842 1727204499.76844: sending task result for task 0affcd87-79f5-aad0-d242-0000000002ed 44842 1727204499.76921: done sending task result for task 0affcd87-79f5-aad0-d242-0000000002ed 44842 1727204499.76924: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204499.76969: no more pending results, returning what we have 44842 1727204499.76972: results queue empty 44842 1727204499.76974: checking for any_errors_fatal 44842 1727204499.76980: done checking for any_errors_fatal 44842 1727204499.76981: checking for max_fail_percentage 44842 1727204499.76982: done checking for max_fail_percentage 44842 1727204499.76983: checking to see if all hosts have failed and the running result is not ok 44842 1727204499.76984: done checking to see if all hosts have failed 44842 1727204499.76985: getting the remaining hosts for this loop 44842 1727204499.76987: done getting the remaining hosts for this loop 44842 1727204499.76990: getting the next task for host managed-node1 44842 1727204499.76996: done getting next task for host managed-node1 44842 1727204499.76999: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 44842 1727204499.77000: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204499.77003: getting variables 44842 1727204499.77005: in VariableManager get_vars() 44842 1727204499.77035: Calling all_inventory to load vars for managed-node1 44842 1727204499.77037: Calling groups_inventory to load vars for managed-node1 44842 1727204499.77039: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204499.77047: Calling all_plugins_play to load vars for managed-node1 44842 1727204499.77049: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204499.77052: Calling groups_plugins_play to load vars for managed-node1 44842 1727204499.77174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204499.77322: done with get_vars() 44842 1727204499.77330: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:23 Tuesday 24 September 2024 15:01:39 -0400 (0:00:00.020) 0:00:09.942 ***** 44842 1727204499.77407: entering _queue_task() for managed-node1/lineinfile 44842 1727204499.77408: Creating lock for lineinfile 44842 1727204499.77584: worker is 1 (out of 1 available) 44842 1727204499.77600: exiting _queue_task() for managed-node1/lineinfile 44842 1727204499.77611: done queuing things up, now waiting for results queue to drain 44842 1727204499.77613: waiting for pending results... 
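The `lineinfile` task queued here (task path `tests_routing_rules.yml:23`) eventually executes with the `module_args` printed in the module result further down: `path=/etc/iproute2/rt_tables.d/table.conf`, `line="200 custom"`, `mode="0644"`, `create=true`. A hedged reconstruction of the task from those arguments (a sketch, not the verbatim playbook source):

```yaml
# Reconstructed from the module_args in the log output below.
- name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
  lineinfile:
    path: /etc/iproute2/rt_tables.d/table.conf
    line: "200 custom"   # table ID 200, table name "custom"
    mode: "0644"
    create: true         # file is created if absent
```

All remaining parameters in the invocation (`regexp`, `insertafter`, `backup`, and so on) appear at their module defaults, so they would not need to be stated in the playbook.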
44842 1727204499.77769: running TaskExecutor() for managed-node1/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 44842 1727204499.77816: in run() - task 0affcd87-79f5-aad0-d242-00000000000f 44842 1727204499.77832: variable 'ansible_search_path' from source: unknown 44842 1727204499.77860: calling self._execute() 44842 1727204499.77923: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.77933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.77942: variable 'omit' from source: magic vars 44842 1727204499.78200: variable 'ansible_distribution_major_version' from source: facts 44842 1727204499.78210: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204499.78216: variable 'omit' from source: magic vars 44842 1727204499.78231: variable 'omit' from source: magic vars 44842 1727204499.78257: variable 'omit' from source: magic vars 44842 1727204499.78291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204499.78316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204499.78331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204499.78344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204499.78354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204499.78381: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204499.78387: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.78390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.78455: Set 
connection var ansible_shell_type to sh 44842 1727204499.78467: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204499.78475: Set connection var ansible_connection to ssh 44842 1727204499.78481: Set connection var ansible_pipelining to False 44842 1727204499.78486: Set connection var ansible_timeout to 10 44842 1727204499.78493: Set connection var ansible_shell_executable to /bin/sh 44842 1727204499.78550: variable 'ansible_shell_executable' from source: unknown 44842 1727204499.78553: variable 'ansible_connection' from source: unknown 44842 1727204499.78555: variable 'ansible_module_compression' from source: unknown 44842 1727204499.78557: variable 'ansible_shell_type' from source: unknown 44842 1727204499.78559: variable 'ansible_shell_executable' from source: unknown 44842 1727204499.78560: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204499.78562: variable 'ansible_pipelining' from source: unknown 44842 1727204499.78566: variable 'ansible_timeout' from source: unknown 44842 1727204499.78571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204499.78773: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204499.78788: variable 'omit' from source: magic vars 44842 1727204499.78809: starting attempt loop 44842 1727204499.78816: running the handler 44842 1727204499.78834: _low_level_execute_command(): starting 44842 1727204499.78848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204499.79609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204499.79624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.79638: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.79656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.79708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.79720: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204499.79733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.79751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204499.79767: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204499.79780: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204499.79804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.79820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.79850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.79862: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204499.79880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.79970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.80009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.80061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204499.81585: stdout chunk (state=3): >>>/root <<< 44842 1727204499.81692: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 44842 1727204499.81737: stderr chunk (state=3): >>><<< 44842 1727204499.81745: stdout chunk (state=3): >>><<< 44842 1727204499.81774: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.81794: _low_level_execute_command(): starting 44842 1727204499.81797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629 `" && echo ansible-tmp-1727204499.817718-45762-131711751345629="` echo /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629 `" ) && sleep 0' 44842 1727204499.82409: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204499.82432: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204499.82457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.82479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.82521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.82534: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204499.82547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.82576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204499.82590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204499.82601: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204499.82612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204499.82625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204499.82639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204499.82650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204499.82661: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204499.82688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204499.82762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204499.82792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204499.82809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204499.82903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44842 1727204499.84719: stdout chunk (state=3): >>>ansible-tmp-1727204499.817718-45762-131711751345629=/root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629 <<< 44842 1727204499.84877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204499.84915: stderr chunk (state=3): >>><<< 44842 1727204499.84918: stdout chunk (state=3): >>><<< 44842 1727204499.85175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204499.817718-45762-131711751345629=/root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204499.85178: variable 'ansible_module_compression' from source: unknown 44842 1727204499.85181: ANSIBALLZ: Using lock for lineinfile 44842 1727204499.85183: ANSIBALLZ: Acquiring lock 44842 1727204499.85185: ANSIBALLZ: Lock acquired: 140164879307200 44842 
1727204499.85187: ANSIBALLZ: Creating module 44842 1727204500.00488: ANSIBALLZ: Writing module into payload 44842 1727204500.00650: ANSIBALLZ: Writing module 44842 1727204500.00686: ANSIBALLZ: Renaming module 44842 1727204500.00697: ANSIBALLZ: Done creating module 44842 1727204500.00728: variable 'ansible_facts' from source: unknown 44842 1727204500.00805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/AnsiballZ_lineinfile.py 44842 1727204500.00975: Sending initial data 44842 1727204500.00978: Sent initial data (158 bytes) 44842 1727204500.02007: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204500.02028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.02044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.02063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.02106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.02121: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.02143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.02161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204500.02176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204500.02186: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204500.02198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.02210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.02224: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.02239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.02253: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204500.02269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.02346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.02376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.02392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.02485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.04199: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204500.04246: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204500.04302: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpbqjra_gf /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/AnsiballZ_lineinfile.py <<< 44842 1727204500.04349: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204500.05757: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 44842 1727204500.05861: stderr chunk (state=3): >>><<< 44842 1727204500.05869: stdout chunk (state=3): >>><<< 44842 1727204500.05891: done transferring module to remote 44842 1727204500.05902: _low_level_execute_command(): starting 44842 1727204500.05907: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/ /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/AnsiballZ_lineinfile.py && sleep 0' 44842 1727204500.06515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204500.06524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.06534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.06548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.06589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.06596: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.06605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.06618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204500.06625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204500.06632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204500.06639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.06648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.06660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.06673: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.06680: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204500.06690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.06761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.06783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.06791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.06874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.08573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204500.08636: stderr chunk (state=3): >>><<< 44842 1727204500.08639: stdout chunk (state=3): >>><<< 44842 1727204500.08654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204500.08659: _low_level_execute_command(): starting 44842 1727204500.08671: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/AnsiballZ_lineinfile.py && sleep 0' 44842 1727204500.09267: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204500.09280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.09290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.09303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.09339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.09346: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.09355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.09375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204500.09381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204500.09388: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204500.09396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.09405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.09416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.09423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 
1727204500.09429: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204500.09438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.09513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.09527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.09538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.09745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.23374: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 44842 1727204500.24397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204500.24401: stdout chunk (state=3): >>><<< 44842 1727204500.24406: stderr chunk (state=3): >>><<< 44842 1727204500.24427: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204500.24475: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204500.24484: _low_level_execute_command(): starting 44842 1727204500.24488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204499.817718-45762-131711751345629/ > /dev/null 2>&1 && sleep 0' 44842 1727204500.25131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204500.25140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.25151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.25168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.25208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.25215: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.25225: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.25239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204500.25247: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204500.25254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204500.25262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.25279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.25290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.25295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.25303: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204500.25312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.25405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.25414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.25417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.25502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.27281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204500.27365: stderr chunk (state=3): >>><<< 44842 1727204500.27371: stdout chunk (state=3): >>><<< 44842 1727204500.27390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204500.27396: handler run complete 44842 1727204500.27425: attempt loop complete, returning result 44842 1727204500.27428: _execute() done 44842 1727204500.27430: dumping result to json 44842 1727204500.27436: done dumping result, returning 44842 1727204500.27445: done running TaskExecutor() for managed-node1/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [0affcd87-79f5-aad0-d242-00000000000f] 44842 1727204500.27451: sending task result for task 0affcd87-79f5-aad0-d242-00000000000f 44842 1727204500.27565: done sending task result for task 0affcd87-79f5-aad0-d242-00000000000f 44842 1727204500.27568: WORKER PROCESS EXITING changed: [managed-node1] => { "backup": "", "changed": true } MSG: line added 44842 1727204500.27692: no more pending results, returning what we have 44842 1727204500.27697: results queue empty 44842 1727204500.27698: checking for any_errors_fatal 44842 1727204500.27704: done checking for any_errors_fatal 44842 1727204500.27705: checking for max_fail_percentage 44842 1727204500.27706: done 
checking for max_fail_percentage 44842 1727204500.27707: checking to see if all hosts have failed and the running result is not ok 44842 1727204500.27708: done checking to see if all hosts have failed 44842 1727204500.27708: getting the remaining hosts for this loop 44842 1727204500.27711: done getting the remaining hosts for this loop 44842 1727204500.27715: getting the next task for host managed-node1 44842 1727204500.27723: done getting next task for host managed-node1 44842 1727204500.27728: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44842 1727204500.27731: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204500.27747: getting variables 44842 1727204500.27749: in VariableManager get_vars() 44842 1727204500.27790: Calling all_inventory to load vars for managed-node1 44842 1727204500.27794: Calling groups_inventory to load vars for managed-node1 44842 1727204500.27796: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204500.27806: Calling all_plugins_play to load vars for managed-node1 44842 1727204500.27809: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204500.27812: Calling groups_plugins_play to load vars for managed-node1 44842 1727204500.28000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204500.28231: done with get_vars() 44842 1727204500.28244: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:40 -0400 (0:00:00.510) 0:00:10.452 ***** 44842 1727204500.28467: entering _queue_task() for managed-node1/include_tasks 44842 1727204500.29028: worker is 1 (out of 1 available) 44842 1727204500.29040: exiting _queue_task() for managed-node1/include_tasks 44842 1727204500.29052: done queuing things up, now waiting for results queue to drain 44842 1727204500.29060: waiting for pending results... 
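The lineinfile invocation that completed above (rc=0, "line added") can be read back as a playbook task. The following is a reconstruction from the logged module_args only, not the original play source; defaults that were logged as null are omitted:

```yaml
# Reconstructed from the logged module_args; task name taken from the
# TASK line in the log. Inferred, not copied from the play file.
- name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
  ansible.builtin.lineinfile:
    path: /etc/iproute2/rt_tables.d/table.conf
    line: 200 custom
    mode: "0644"
    create: true
    state: present
```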
44842 1727204500.29353: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44842 1727204500.29504: in run() - task 0affcd87-79f5-aad0-d242-000000000017 44842 1727204500.29529: variable 'ansible_search_path' from source: unknown 44842 1727204500.29537: variable 'ansible_search_path' from source: unknown 44842 1727204500.29584: calling self._execute() 44842 1727204500.29699: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.29712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204500.29759: variable 'omit' from source: magic vars 44842 1727204500.30456: variable 'ansible_distribution_major_version' from source: facts 44842 1727204500.30480: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204500.30493: _execute() done 44842 1727204500.30502: dumping result to json 44842 1727204500.30510: done dumping result, returning 44842 1727204500.30521: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-aad0-d242-000000000017] 44842 1727204500.30533: sending task result for task 0affcd87-79f5-aad0-d242-000000000017 44842 1727204500.30649: done sending task result for task 0affcd87-79f5-aad0-d242-000000000017 44842 1727204500.30658: WORKER PROCESS EXITING 44842 1727204500.30708: no more pending results, returning what we have 44842 1727204500.30715: in VariableManager get_vars() 44842 1727204500.30757: Calling all_inventory to load vars for managed-node1 44842 1727204500.30761: Calling groups_inventory to load vars for managed-node1 44842 1727204500.30768: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204500.30782: Calling all_plugins_play to load vars for managed-node1 44842 1727204500.30785: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204500.30788: Calling 
groups_plugins_play to load vars for managed-node1 44842 1727204500.31039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204500.31234: done with get_vars() 44842 1727204500.31244: variable 'ansible_search_path' from source: unknown 44842 1727204500.31245: variable 'ansible_search_path' from source: unknown 44842 1727204500.31847: we have included files to process 44842 1727204500.31849: generating all_blocks data 44842 1727204500.31850: done generating all_blocks data 44842 1727204500.31854: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204500.31855: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204500.31857: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204500.32894: done processing included file 44842 1727204500.32896: iterating over new_blocks loaded from include file 44842 1727204500.32897: in VariableManager get_vars() 44842 1727204500.33032: done with get_vars() 44842 1727204500.33035: filtering new block on tags 44842 1727204500.33053: done filtering new block on tags 44842 1727204500.33055: in VariableManager get_vars() 44842 1727204500.33077: done with get_vars() 44842 1727204500.33079: filtering new block on tags 44842 1727204500.33099: done filtering new block on tags 44842 1727204500.33102: in VariableManager get_vars() 44842 1727204500.33121: done with get_vars() 44842 1727204500.33123: filtering new block on tags 44842 1727204500.33255: done filtering new block on tags 44842 1727204500.33257: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 44842 1727204500.33263: extending task lists for 
all hosts with included blocks 44842 1727204500.34587: done extending task lists 44842 1727204500.34589: done processing included files 44842 1727204500.34590: results queue empty 44842 1727204500.34590: checking for any_errors_fatal 44842 1727204500.34595: done checking for any_errors_fatal 44842 1727204500.34595: checking for max_fail_percentage 44842 1727204500.34596: done checking for max_fail_percentage 44842 1727204500.34597: checking to see if all hosts have failed and the running result is not ok 44842 1727204500.34598: done checking to see if all hosts have failed 44842 1727204500.34599: getting the remaining hosts for this loop 44842 1727204500.34600: done getting the remaining hosts for this loop 44842 1727204500.34603: getting the next task for host managed-node1 44842 1727204500.34607: done getting next task for host managed-node1 44842 1727204500.34610: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44842 1727204500.34613: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204500.34622: getting variables 44842 1727204500.34623: in VariableManager get_vars() 44842 1727204500.34636: Calling all_inventory to load vars for managed-node1 44842 1727204500.34639: Calling groups_inventory to load vars for managed-node1 44842 1727204500.34641: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204500.34645: Calling all_plugins_play to load vars for managed-node1 44842 1727204500.34648: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204500.34655: Calling groups_plugins_play to load vars for managed-node1 44842 1727204500.34855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204500.35073: done with get_vars() 44842 1727204500.35087: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:40 -0400 (0:00:00.066) 0:00:10.519 ***** 44842 1727204500.35154: entering _queue_task() for managed-node1/setup 44842 1727204500.35468: worker is 1 (out of 1 available) 44842 1727204500.35479: exiting _queue_task() for managed-node1/setup 44842 1727204500.35491: done queuing things up, now waiting for results queue to drain 44842 1727204500.35492: waiting for pending results... 
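Before the role's fact checks run, the effect of the earlier lineinfile task could be verified by hand on the managed node. A self-contained sketch of that check, using a temp file in place of the real /etc/iproute2/rt_tables.d/table.conf so it can run anywhere:

```shell
# Reproduce what 'line: 200 custom' ensured, then verify it exactly,
# the way one might against the real file on managed-node1.
conf="$(mktemp)"
printf '200 custom\n' > "$conf"
grep -qx '200 custom' "$conf" && echo "line present"
rm -f "$conf"
```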
44842 1727204500.35817: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44842 1727204500.35983: in run() - task 0affcd87-79f5-aad0-d242-0000000003d0 44842 1727204500.36006: variable 'ansible_search_path' from source: unknown 44842 1727204500.36014: variable 'ansible_search_path' from source: unknown 44842 1727204500.36061: calling self._execute() 44842 1727204500.36149: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.36162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204500.36184: variable 'omit' from source: magic vars 44842 1727204500.36582: variable 'ansible_distribution_major_version' from source: facts 44842 1727204500.36605: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204500.36846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204500.39560: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204500.39642: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204500.39686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204500.39736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204500.39771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204500.39872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204500.39908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204500.39949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204500.39998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204500.40018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204500.40088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204500.40116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204500.40150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204500.40200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204500.40218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204500.40394: variable '__network_required_facts' from source: role 
'' defaults 44842 1727204500.40409: variable 'ansible_facts' from source: unknown 44842 1727204500.40516: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44842 1727204500.40523: when evaluation is False, skipping this task 44842 1727204500.40528: _execute() done 44842 1727204500.40533: dumping result to json 44842 1727204500.40538: done dumping result, returning 44842 1727204500.40547: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-aad0-d242-0000000003d0] 44842 1727204500.40554: sending task result for task 0affcd87-79f5-aad0-d242-0000000003d0 44842 1727204500.40657: done sending task result for task 0affcd87-79f5-aad0-d242-0000000003d0 44842 1727204500.40666: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204500.40725: no more pending results, returning what we have 44842 1727204500.40729: results queue empty 44842 1727204500.40730: checking for any_errors_fatal 44842 1727204500.40732: done checking for any_errors_fatal 44842 1727204500.40732: checking for max_fail_percentage 44842 1727204500.40734: done checking for max_fail_percentage 44842 1727204500.40735: checking to see if all hosts have failed and the running result is not ok 44842 1727204500.40736: done checking to see if all hosts have failed 44842 1727204500.40736: getting the remaining hosts for this loop 44842 1727204500.40738: done getting the remaining hosts for this loop 44842 1727204500.40742: getting the next task for host managed-node1 44842 1727204500.40752: done getting next task for host managed-node1 44842 1727204500.40755: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44842 1727204500.40760: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204500.40776: getting variables 44842 1727204500.40778: in VariableManager get_vars() 44842 1727204500.40819: Calling all_inventory to load vars for managed-node1 44842 1727204500.40822: Calling groups_inventory to load vars for managed-node1 44842 1727204500.40825: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204500.40835: Calling all_plugins_play to load vars for managed-node1 44842 1727204500.40838: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204500.40842: Calling groups_plugins_play to load vars for managed-node1 44842 1727204500.41038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204500.41326: done with get_vars() 44842 1727204500.41339: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:01:40 -0400 (0:00:00.064) 0:00:10.583 ***** 44842 1727204500.41588: entering _queue_task() for managed-node1/stat 44842 1727204500.42044: worker is 1 (out of 1 
available) 44842 1727204500.42056: exiting _queue_task() for managed-node1/stat 44842 1727204500.42073: done queuing things up, now waiting for results queue to drain 44842 1727204500.42074: waiting for pending results... 44842 1727204500.42367: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 44842 1727204500.42526: in run() - task 0affcd87-79f5-aad0-d242-0000000003d2 44842 1727204500.42545: variable 'ansible_search_path' from source: unknown 44842 1727204500.42553: variable 'ansible_search_path' from source: unknown 44842 1727204500.42601: calling self._execute() 44842 1727204500.42691: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.42706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204500.42720: variable 'omit' from source: magic vars 44842 1727204500.43109: variable 'ansible_distribution_major_version' from source: facts 44842 1727204500.43127: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204500.43322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204500.43629: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204500.43687: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204500.43729: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204500.43772: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204500.43867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204500.43905: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204500.43942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204500.43979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204500.44081: variable '__network_is_ostree' from source: set_fact 44842 1727204500.44093: Evaluated conditional (not __network_is_ostree is defined): False 44842 1727204500.44100: when evaluation is False, skipping this task 44842 1727204500.44107: _execute() done 44842 1727204500.44118: dumping result to json 44842 1727204500.44124: done dumping result, returning 44842 1727204500.44134: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-aad0-d242-0000000003d2] 44842 1727204500.44146: sending task result for task 0affcd87-79f5-aad0-d242-0000000003d2 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44842 1727204500.44302: no more pending results, returning what we have 44842 1727204500.44306: results queue empty 44842 1727204500.44307: checking for any_errors_fatal 44842 1727204500.44315: done checking for any_errors_fatal 44842 1727204500.44316: checking for max_fail_percentage 44842 1727204500.44317: done checking for max_fail_percentage 44842 1727204500.44318: checking to see if all hosts have failed and the running result is not ok 44842 1727204500.44319: done checking to see if all hosts have failed 44842 1727204500.44320: getting the remaining hosts for this loop 44842 
1727204500.44322: done getting the remaining hosts for this loop 44842 1727204500.44326: getting the next task for host managed-node1 44842 1727204500.44334: done getting next task for host managed-node1 44842 1727204500.44338: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44842 1727204500.44342: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204500.44358: getting variables 44842 1727204500.44362: in VariableManager get_vars() 44842 1727204500.44402: Calling all_inventory to load vars for managed-node1 44842 1727204500.44405: Calling groups_inventory to load vars for managed-node1 44842 1727204500.44407: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204500.44417: Calling all_plugins_play to load vars for managed-node1 44842 1727204500.44419: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204500.44422: Calling groups_plugins_play to load vars for managed-node1 44842 1727204500.44622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204500.44875: done with get_vars() 44842 1727204500.45002: done getting variables 44842 1727204500.45034: done sending task result for task 0affcd87-79f5-aad0-d242-0000000003d2 44842 1727204500.45036: WORKER PROCESS EXITING 44842 1727204500.45075: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:01:40 -0400 (0:00:00.036) 0:00:10.620 ***** 44842 1727204500.45227: entering _queue_task() for managed-node1/set_fact 44842 1727204500.45544: worker is 1 (out of 1 available) 44842 1727204500.45557: exiting _queue_task() for managed-node1/set_fact 44842 1727204500.45573: done queuing things up, now waiting for results queue to drain 44842 1727204500.45574: waiting for pending results... 
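The two skipped tasks above were both short-circuited by `when:` conditionals. A sketch of the logic behind each, with hypothetical fact names and values chosen only for illustration (the real values are in the host's gathered facts):

```python
# 1) __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
#    i.e. "are any required facts still missing?" Jinja's difference filter
#    behaves like Python set difference.
required_facts = {"distribution", "distribution_major_version"}  # hypothetical
ansible_facts = {"distribution": "CentOS", "distribution_major_version": "9"}
missing = required_facts.difference(ansible_facts.keys())
print(len(missing) > 0)  # False -> "Ensure ansible_facts used by role are present" is skipped

# 2) not __network_is_ostree is defined
#    The variable was already set by an earlier set_fact, so the stat and
#    set_fact tasks that would compute it are skipped.
host_vars = {"__network_is_ostree": False}
print("__network_is_ostree" not in host_vars)  # False -> both ostree tasks are skipped
```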
44842 1727204500.45845: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44842 1727204500.46002: in run() - task 0affcd87-79f5-aad0-d242-0000000003d3 44842 1727204500.46027: variable 'ansible_search_path' from source: unknown 44842 1727204500.46034: variable 'ansible_search_path' from source: unknown 44842 1727204500.46080: calling self._execute() 44842 1727204500.46172: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.46183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204500.46201: variable 'omit' from source: magic vars 44842 1727204500.46805: variable 'ansible_distribution_major_version' from source: facts 44842 1727204500.46869: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204500.47244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204500.47898: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204500.48028: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204500.48082: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204500.48122: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204500.48220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204500.48250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204500.48291: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204500.48397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204500.48696: variable '__network_is_ostree' from source: set_fact 44842 1727204500.48715: Evaluated conditional (not __network_is_ostree is defined): False 44842 1727204500.48722: when evaluation is False, skipping this task 44842 1727204500.48823: _execute() done 44842 1727204500.48831: dumping result to json 44842 1727204500.48839: done dumping result, returning 44842 1727204500.48863: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-aad0-d242-0000000003d3] 44842 1727204500.48936: sending task result for task 0affcd87-79f5-aad0-d242-0000000003d3 44842 1727204500.49056: done sending task result for task 0affcd87-79f5-aad0-d242-0000000003d3 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44842 1727204500.49115: no more pending results, returning what we have 44842 1727204500.49120: results queue empty 44842 1727204500.49122: checking for any_errors_fatal 44842 1727204500.49130: done checking for any_errors_fatal 44842 1727204500.49131: checking for max_fail_percentage 44842 1727204500.49132: done checking for max_fail_percentage 44842 1727204500.49134: checking to see if all hosts have failed and the running result is not ok 44842 1727204500.49135: done checking to see if all hosts have failed 44842 1727204500.49136: getting the remaining hosts for this loop 44842 1727204500.49138: done getting the remaining hosts for this loop 44842 1727204500.49143: getting the next task for 
host managed-node1 44842 1727204500.49157: done getting next task for host managed-node1 44842 1727204500.49166: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44842 1727204500.49171: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204500.49186: getting variables 44842 1727204500.49189: in VariableManager get_vars() 44842 1727204500.49230: Calling all_inventory to load vars for managed-node1 44842 1727204500.49234: Calling groups_inventory to load vars for managed-node1 44842 1727204500.49236: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204500.49248: Calling all_plugins_play to load vars for managed-node1 44842 1727204500.49251: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204500.49254: Calling groups_plugins_play to load vars for managed-node1 44842 1727204500.49527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204500.49997: done with get_vars() 44842 1727204500.50010: done getting variables 44842 1727204500.50277: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:01:40 -0400 (0:00:00.051) 0:00:10.671 ***** 44842 1727204500.50353: entering _queue_task() for managed-node1/service_facts 44842 1727204500.50355: Creating lock for service_facts 44842 1727204500.51028: worker is 1 (out of 1 available) 44842 1727204500.51041: exiting _queue_task() for managed-node1/service_facts 44842 1727204500.51056: done queuing things up, now waiting for results queue to drain 44842 1727204500.51057: waiting for pending results... 
44842 1727204500.51443: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 44842 1727204500.51849: in run() - task 0affcd87-79f5-aad0-d242-0000000003d5 44842 1727204500.51915: variable 'ansible_search_path' from source: unknown 44842 1727204500.51947: variable 'ansible_search_path' from source: unknown 44842 1727204500.51993: calling self._execute() 44842 1727204500.52268: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.52285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204500.52343: variable 'omit' from source: magic vars 44842 1727204500.53111: variable 'ansible_distribution_major_version' from source: facts 44842 1727204500.53130: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204500.53200: variable 'omit' from source: magic vars 44842 1727204500.53359: variable 'omit' from source: magic vars 44842 1727204500.53406: variable 'omit' from source: magic vars 44842 1727204500.53489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204500.53528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204500.53557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204500.53586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204500.53603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204500.53635: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204500.53646: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.53659: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44842 1727204500.53763: Set connection var ansible_shell_type to sh 44842 1727204500.53788: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204500.53798: Set connection var ansible_connection to ssh 44842 1727204500.53812: Set connection var ansible_pipelining to False 44842 1727204500.53822: Set connection var ansible_timeout to 10 44842 1727204500.53832: Set connection var ansible_shell_executable to /bin/sh 44842 1727204500.53858: variable 'ansible_shell_executable' from source: unknown 44842 1727204500.53871: variable 'ansible_connection' from source: unknown 44842 1727204500.53885: variable 'ansible_module_compression' from source: unknown 44842 1727204500.53923: variable 'ansible_shell_type' from source: unknown 44842 1727204500.53931: variable 'ansible_shell_executable' from source: unknown 44842 1727204500.53937: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204500.53943: variable 'ansible_pipelining' from source: unknown 44842 1727204500.53950: variable 'ansible_timeout' from source: unknown 44842 1727204500.53957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204500.54266: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204500.54283: variable 'omit' from source: magic vars 44842 1727204500.54292: starting attempt loop 44842 1727204500.54300: running the handler 44842 1727204500.54322: _low_level_execute_command(): starting 44842 1727204500.54335: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204500.55110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204500.55127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 44842 1727204500.55142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.55165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.55214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.55231: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.55247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.55272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204500.55285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204500.55296: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204500.55311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.55326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.55346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.55359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.55378: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204500.55393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.55482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.55506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.55526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.55619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204500.57174: stdout chunk (state=3): >>>/root <<< 44842 1727204500.57385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204500.57388: stdout chunk (state=3): >>><<< 44842 1727204500.57391: stderr chunk (state=3): >>><<< 44842 1727204500.57517: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204500.57520: _low_level_execute_command(): starting 44842 1727204500.57524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437 `" && echo ansible-tmp-1727204500.574122-45795-227463553439437="` echo /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437 `" ) && sleep 0' 44842 1727204500.58097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 44842 1727204500.58111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.58126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.58145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.58197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.58209: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.58224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.58251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204500.58270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204500.58285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204500.58298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.58312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.58342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.58434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.58448: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204500.58468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.58547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.58575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.58591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 44842 1727204500.58683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.60519: stdout chunk (state=3): >>>ansible-tmp-1727204500.574122-45795-227463553439437=/root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437 <<< 44842 1727204500.60720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204500.60724: stdout chunk (state=3): >>><<< 44842 1727204500.60727: stderr chunk (state=3): >>><<< 44842 1727204500.60770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204500.574122-45795-227463553439437=/root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204500.60972: variable 'ansible_module_compression' from source: unknown 44842 1727204500.60975: ANSIBALLZ: Using lock for service_facts 44842 
1727204500.60977: ANSIBALLZ: Acquiring lock 44842 1727204500.60979: ANSIBALLZ: Lock acquired: 140164877932144 44842 1727204500.60981: ANSIBALLZ: Creating module 44842 1727204500.75916: ANSIBALLZ: Writing module into payload 44842 1727204500.75997: ANSIBALLZ: Writing module 44842 1727204500.76015: ANSIBALLZ: Renaming module 44842 1727204500.76018: ANSIBALLZ: Done creating module 44842 1727204500.76035: variable 'ansible_facts' from source: unknown 44842 1727204500.76084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/AnsiballZ_service_facts.py 44842 1727204500.76195: Sending initial data 44842 1727204500.76199: Sent initial data (161 bytes) 44842 1727204500.76858: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.76867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.76900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204500.76905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.76908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204500.76910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.76959: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 44842 1727204500.76970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.76972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.77028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.78702: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204500.78755: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204500.78806: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmptpck_0t6 /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/AnsiballZ_service_facts.py <<< 44842 1727204500.78855: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204500.79713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204500.79823: stderr chunk (state=3): >>><<< 44842 1727204500.79827: stdout chunk (state=3): >>><<< 44842 1727204500.79842: done transferring module to remote 44842 1727204500.79851: _low_level_execute_command(): starting 44842 1727204500.79856: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/ 
/root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/AnsiballZ_service_facts.py && sleep 0' 44842 1727204500.80295: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204500.80300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.80328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204500.80336: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204500.80344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.80369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204500.80377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.80419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204500.80432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.80443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.80506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204500.82199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204500.82243: stderr chunk (state=3): >>><<< 44842 1727204500.82248: stdout chunk 
(state=3): >>><<< 44842 1727204500.82268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204500.82274: _low_level_execute_command(): starting 44842 1727204500.82279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/AnsiballZ_service_facts.py && sleep 0' 44842 1727204500.82721: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204500.82734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204500.82745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 
1727204500.82757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204500.82770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204500.82822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204500.82827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204500.82896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204502.12315: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": 
"disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 44842 1727204502.12338: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 44842 1727204502.12342: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 44842 1727204502.12347: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": 
"ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-up<<< 44842 1727204502.12388: stdout chunk (state=3): >>>date.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hi<<< 44842 1727204502.12397: stdout chunk (state=3): >>>bernate-resume@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44842 1727204502.13682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204502.13735: stderr chunk (state=3): >>><<< 44842 1727204502.13739: stdout chunk (state=3): >>><<< 44842 1727204502.13878: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": 
"dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": 
{"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204502.15805: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204502.15822: _low_level_execute_command(): starting 44842 1727204502.15841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204500.574122-45795-227463553439437/ > /dev/null 2>&1 && sleep 0' 44842 1727204502.16419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.16425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.16470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.16473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.16482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.16525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204502.16532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204502.16542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204502.16609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204502.18381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204502.18526: stderr chunk (state=3): >>><<< 44842 1727204502.18531: stdout chunk (state=3): >>><<< 44842 1727204502.18546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
44842 1727204502.18552: handler run complete 44842 1727204502.18720: variable 'ansible_facts' from source: unknown 44842 1727204502.18809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204502.19058: variable 'ansible_facts' from source: unknown 44842 1727204502.19133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204502.19240: attempt loop complete, returning result 44842 1727204502.19244: _execute() done 44842 1727204502.19246: dumping result to json 44842 1727204502.19282: done dumping result, returning 44842 1727204502.19292: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-aad0-d242-0000000003d5] 44842 1727204502.19297: sending task result for task 0affcd87-79f5-aad0-d242-0000000003d5 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204502.19831: no more pending results, returning what we have 44842 1727204502.19834: results queue empty 44842 1727204502.19835: checking for any_errors_fatal 44842 1727204502.19840: done checking for any_errors_fatal 44842 1727204502.19841: checking for max_fail_percentage 44842 1727204502.19842: done checking for max_fail_percentage 44842 1727204502.19843: checking to see if all hosts have failed and the running result is not ok 44842 1727204502.19844: done checking to see if all hosts have failed 44842 1727204502.19845: getting the remaining hosts for this loop 44842 1727204502.19846: done getting the remaining hosts for this loop 44842 1727204502.19849: getting the next task for host managed-node1 44842 1727204502.19854: done getting next task for host managed-node1 44842 1727204502.19857: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44842 
1727204502.19860: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204502.19871: getting variables 44842 1727204502.19872: in VariableManager get_vars() 44842 1727204502.19904: Calling all_inventory to load vars for managed-node1 44842 1727204502.19906: Calling groups_inventory to load vars for managed-node1 44842 1727204502.19908: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204502.19917: Calling all_plugins_play to load vars for managed-node1 44842 1727204502.19919: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204502.19922: Calling groups_plugins_play to load vars for managed-node1 44842 1727204502.20158: done sending task result for task 0affcd87-79f5-aad0-d242-0000000003d5 44842 1727204502.20165: WORKER PROCESS EXITING 44842 1727204502.20171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204502.20441: done with get_vars() 44842 1727204502.20451: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:42 -0400 (0:00:01.701) 0:00:12.373 ***** 44842 1727204502.20547: entering _queue_task() for managed-node1/package_facts 44842 1727204502.20549: Creating lock for package_facts 44842 1727204502.20898: worker is 1 (out of 1 available) 44842 1727204502.20908: exiting _queue_task() for managed-node1/package_facts 44842 1727204502.20920: done queuing things up, now waiting for results queue to drain 44842 1727204502.20921: waiting for pending results... 44842 1727204502.21237: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 44842 1727204502.21401: in run() - task 0affcd87-79f5-aad0-d242-0000000003d6 44842 1727204502.21422: variable 'ansible_search_path' from source: unknown 44842 1727204502.21430: variable 'ansible_search_path' from source: unknown 44842 1727204502.21473: calling self._execute() 44842 1727204502.21560: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204502.21574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204502.21596: variable 'omit' from source: magic vars 44842 1727204502.21979: variable 'ansible_distribution_major_version' from source: facts 44842 1727204502.21998: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204502.22009: variable 'omit' from source: magic vars 44842 1727204502.22092: variable 'omit' from source: magic vars 44842 1727204502.22137: variable 'omit' from source: magic vars 44842 1727204502.22182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204502.22221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204502.22277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
44842 1727204502.22308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204502.22338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204502.22417: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204502.22440: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204502.22456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204502.22602: Set connection var ansible_shell_type to sh 44842 1727204502.22617: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204502.22620: Set connection var ansible_connection to ssh 44842 1727204502.22622: Set connection var ansible_pipelining to False 44842 1727204502.22630: Set connection var ansible_timeout to 10 44842 1727204502.22636: Set connection var ansible_shell_executable to /bin/sh 44842 1727204502.22653: variable 'ansible_shell_executable' from source: unknown 44842 1727204502.22656: variable 'ansible_connection' from source: unknown 44842 1727204502.22659: variable 'ansible_module_compression' from source: unknown 44842 1727204502.22661: variable 'ansible_shell_type' from source: unknown 44842 1727204502.22668: variable 'ansible_shell_executable' from source: unknown 44842 1727204502.22671: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204502.22676: variable 'ansible_pipelining' from source: unknown 44842 1727204502.22682: variable 'ansible_timeout' from source: unknown 44842 1727204502.22685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204502.22839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204502.22846: variable 'omit' from source: magic vars 44842 1727204502.22851: starting attempt loop 44842 1727204502.22854: running the handler 44842 1727204502.22869: _low_level_execute_command(): starting 44842 1727204502.22876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204502.23396: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.23400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.23430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.23433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.23436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.23493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204502.23496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204502.23557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204502.25305: stdout chunk (state=3): >>>/root <<< 44842 1727204502.25308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204502.25310: stdout chunk (state=3): >>><<< 44842 1727204502.25312: stderr chunk (state=3): >>><<< 44842 1727204502.25421: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204502.25426: _low_level_execute_command(): starting 44842 1727204502.25429: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810 `" && echo ansible-tmp-1727204502.2532935-45827-200405744670810="` echo /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810 `" ) && sleep 0' 44842 1727204502.26249: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.26253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.26313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.26387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204502.26443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204502.28283: stdout chunk (state=3): >>>ansible-tmp-1727204502.2532935-45827-200405744670810=/root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810 <<< 44842 1727204502.28394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204502.28485: stderr chunk (state=3): >>><<< 44842 1727204502.28491: stdout chunk (state=3): >>><<< 44842 1727204502.28516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204502.2532935-45827-200405744670810=/root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204502.28567: variable 'ansible_module_compression' from source: unknown 44842 1727204502.28626: ANSIBALLZ: Using lock for package_facts 44842 1727204502.28629: ANSIBALLZ: Acquiring lock 44842 1727204502.28632: ANSIBALLZ: Lock acquired: 140164877285280 44842 1727204502.28634: ANSIBALLZ: Creating module 44842 1727204502.52623: ANSIBALLZ: Writing module into payload 44842 1727204502.52737: ANSIBALLZ: Writing module 44842 1727204502.52767: ANSIBALLZ: Renaming module 44842 1727204502.52771: ANSIBALLZ: Done creating module 44842 1727204502.52788: variable 'ansible_facts' from source: unknown 44842 1727204502.52907: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/AnsiballZ_package_facts.py 44842 1727204502.53251: Sending initial data 44842 1727204502.53262: Sent initial data (162 bytes) 44842 1727204502.53976: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config <<< 44842 1727204502.53982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.54017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.54030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.54042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.54090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204502.54102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204502.54170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204502.55871: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 44842 1727204502.55918: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204502.55974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmptsigitiz /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/AnsiballZ_package_facts.py <<< 44842 1727204502.56077: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204502.58557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204502.58709: stderr chunk (state=3): >>><<< 44842 1727204502.58712: stdout chunk (state=3): >>><<< 44842 1727204502.58740: done transferring module to remote 44842 1727204502.58751: _low_level_execute_command(): starting 44842 1727204502.58756: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/ /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/AnsiballZ_package_facts.py && sleep 0' 44842 1727204502.59309: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.59316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.59349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.59354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 
1727204502.59367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.59376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.59382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.59434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204502.59452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204502.59511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204502.61209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204502.61270: stderr chunk (state=3): >>><<< 44842 1727204502.61274: stdout chunk (state=3): >>><<< 44842 1727204502.61289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204502.61299: _low_level_execute_command(): starting 44842 1727204502.61302: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/AnsiballZ_package_facts.py && sleep 0' 44842 1727204502.61904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.61908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204502.61955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.61965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204502.61984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204502.61990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204502.62071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204502.62094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44842 1727204502.62183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204503.08175: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version<<< 44842 1727204503.08307: stdout chunk (state=3): >>>": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": 
"libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", 
"version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy",
"version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": 
[{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": 
"18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": 
"2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": 
"x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch":
"x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1,
"arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"<<< 44842 1727204503.08359: stdout chunk (state=3): >>>}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils",<<< 44842 1727204503.08370: stdout chunk (state=3): >>> "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": 
"wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44842 1727204503.09883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204503.09995: stderr chunk (state=3): >>><<< 44842 1727204503.09998: stdout chunk (state=3): >>><<< 44842 1727204503.10078: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", 
"version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": 
[{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", 
"version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": 
"shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": 
[{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": 
"21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": 
"cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": 
[{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": 
"4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": 
"54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": 
"python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": 
"libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": 
[{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": 
"613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": 
"3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": 
"41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", 
"version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": 
[{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204503.12775: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204503.12803: _low_level_execute_command(): starting 44842 1727204503.12812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204502.2532935-45827-200405744670810/ > /dev/null 2>&1 && sleep 0' 44842 1727204503.13450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204503.13465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204503.13480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204503.13494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204503.13537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204503.13547: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204503.13561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204503.13581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204503.13592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 44842 1727204503.13602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204503.13616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204503.13628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204503.13643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204503.13654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204503.13665: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204503.13679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204503.13757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204503.13775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204503.13788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204503.13901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204503.15786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204503.15832: stderr chunk (state=3): >>><<< 44842 1727204503.15836: stdout chunk (state=3): >>><<< 44842 1727204503.15971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204503.15975: handler run complete 44842 1727204503.16824: variable 'ansible_facts' from source: unknown 44842 1727204503.17320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.19476: variable 'ansible_facts' from source: unknown 44842 1727204503.19953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.20746: attempt loop complete, returning result 44842 1727204503.20771: _execute() done 44842 1727204503.20778: dumping result to json 44842 1727204503.21001: done dumping result, returning 44842 1727204503.21015: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-aad0-d242-0000000003d6] 44842 1727204503.21023: sending task result for task 0affcd87-79f5-aad0-d242-0000000003d6 44842 1727204503.27624: done sending task result for task 0affcd87-79f5-aad0-d242-0000000003d6 44842 1727204503.27628: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204503.27697: no more pending results, returning what we have 44842 1727204503.27699: results queue empty 44842 1727204503.27700: checking for 
any_errors_fatal 44842 1727204503.27703: done checking for any_errors_fatal 44842 1727204503.27704: checking for max_fail_percentage 44842 1727204503.27705: done checking for max_fail_percentage 44842 1727204503.27706: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.27706: done checking to see if all hosts have failed 44842 1727204503.27707: getting the remaining hosts for this loop 44842 1727204503.27708: done getting the remaining hosts for this loop 44842 1727204503.27711: getting the next task for host managed-node1 44842 1727204503.27717: done getting next task for host managed-node1 44842 1727204503.27720: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44842 1727204503.27723: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204503.27731: getting variables 44842 1727204503.27732: in VariableManager get_vars() 44842 1727204503.27759: Calling all_inventory to load vars for managed-node1 44842 1727204503.27761: Calling groups_inventory to load vars for managed-node1 44842 1727204503.27765: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.27773: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.27776: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.27778: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.29104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.30831: done with get_vars() 44842 1727204503.30866: done getting variables 44842 1727204503.30933: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:43 -0400 (0:00:01.104) 0:00:13.477 ***** 44842 1727204503.30970: entering _queue_task() for managed-node1/debug 44842 1727204503.31357: worker is 1 (out of 1 available) 44842 1727204503.31372: exiting _queue_task() for managed-node1/debug 44842 1727204503.31384: done queuing things up, now waiting for results queue to drain 44842 1727204503.31385: waiting for pending results... 
44842 1727204503.31714: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 44842 1727204503.31851: in run() - task 0affcd87-79f5-aad0-d242-000000000018 44842 1727204503.31875: variable 'ansible_search_path' from source: unknown 44842 1727204503.31879: variable 'ansible_search_path' from source: unknown 44842 1727204503.31926: calling self._execute() 44842 1727204503.32034: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.32048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.32075: variable 'omit' from source: magic vars 44842 1727204503.32531: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.32543: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.32555: variable 'omit' from source: magic vars 44842 1727204503.32627: variable 'omit' from source: magic vars 44842 1727204503.32758: variable 'network_provider' from source: set_fact 44842 1727204503.32781: variable 'omit' from source: magic vars 44842 1727204503.32834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204503.32877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204503.32905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204503.32927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204503.32939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204503.32970: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204503.32973: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 
1727204503.32976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.33080: Set connection var ansible_shell_type to sh 44842 1727204503.33091: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204503.33100: Set connection var ansible_connection to ssh 44842 1727204503.33107: Set connection var ansible_pipelining to False 44842 1727204503.33112: Set connection var ansible_timeout to 10 44842 1727204503.33120: Set connection var ansible_shell_executable to /bin/sh 44842 1727204503.33147: variable 'ansible_shell_executable' from source: unknown 44842 1727204503.33151: variable 'ansible_connection' from source: unknown 44842 1727204503.33154: variable 'ansible_module_compression' from source: unknown 44842 1727204503.33156: variable 'ansible_shell_type' from source: unknown 44842 1727204503.33159: variable 'ansible_shell_executable' from source: unknown 44842 1727204503.33165: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.33168: variable 'ansible_pipelining' from source: unknown 44842 1727204503.33171: variable 'ansible_timeout' from source: unknown 44842 1727204503.33173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.33407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204503.33416: variable 'omit' from source: magic vars 44842 1727204503.33426: starting attempt loop 44842 1727204503.33429: running the handler 44842 1727204503.33847: handler run complete 44842 1727204503.33872: attempt loop complete, returning result 44842 1727204503.33880: _execute() done 44842 1727204503.33886: dumping result to json 44842 1727204503.33893: done dumping result, returning 
44842 1727204503.33903: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-aad0-d242-000000000018] 44842 1727204503.33911: sending task result for task 0affcd87-79f5-aad0-d242-000000000018 44842 1727204503.34018: done sending task result for task 0affcd87-79f5-aad0-d242-000000000018 44842 1727204503.34026: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 44842 1727204503.34219: no more pending results, returning what we have 44842 1727204503.34222: results queue empty 44842 1727204503.34223: checking for any_errors_fatal 44842 1727204503.34230: done checking for any_errors_fatal 44842 1727204503.34230: checking for max_fail_percentage 44842 1727204503.34232: done checking for max_fail_percentage 44842 1727204503.34232: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.34233: done checking to see if all hosts have failed 44842 1727204503.34234: getting the remaining hosts for this loop 44842 1727204503.34235: done getting the remaining hosts for this loop 44842 1727204503.34238: getting the next task for host managed-node1 44842 1727204503.34243: done getting next task for host managed-node1 44842 1727204503.34247: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44842 1727204503.34250: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 44842 1727204503.34259: getting variables 44842 1727204503.34260: in VariableManager get_vars() 44842 1727204503.34295: Calling all_inventory to load vars for managed-node1 44842 1727204503.34298: Calling groups_inventory to load vars for managed-node1 44842 1727204503.34300: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.34308: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.34310: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.34313: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.35795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.37452: done with get_vars() 44842 1727204503.37483: done getting variables 44842 1727204503.37547: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.066) 0:00:13.543 ***** 44842 1727204503.37584: entering _queue_task() for managed-node1/fail 44842 1727204503.37900: worker is 1 (out of 1 available) 44842 1727204503.37914: exiting _queue_task() for managed-node1/fail 44842 1727204503.37928: done queuing things up, now waiting for results queue to drain 44842 1727204503.37929: waiting for pending results... 
44842 1727204503.38236: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44842 1727204503.38366: in run() - task 0affcd87-79f5-aad0-d242-000000000019 44842 1727204503.38381: variable 'ansible_search_path' from source: unknown 44842 1727204503.38385: variable 'ansible_search_path' from source: unknown 44842 1727204503.38424: calling self._execute() 44842 1727204503.38523: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.38527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.38537: variable 'omit' from source: magic vars 44842 1727204503.38908: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.38917: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.39043: variable 'network_state' from source: role '' defaults 44842 1727204503.39053: Evaluated conditional (network_state != {}): False 44842 1727204503.39057: when evaluation is False, skipping this task 44842 1727204503.39062: _execute() done 44842 1727204503.39067: dumping result to json 44842 1727204503.39070: done dumping result, returning 44842 1727204503.39073: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-aad0-d242-000000000019] 44842 1727204503.39080: sending task result for task 0affcd87-79f5-aad0-d242-000000000019 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204503.39223: no more pending results, returning what we have 44842 1727204503.39227: results queue empty 44842 1727204503.39228: checking for any_errors_fatal 44842 1727204503.39234: done 
checking for any_errors_fatal 44842 1727204503.39235: checking for max_fail_percentage 44842 1727204503.39237: done checking for max_fail_percentage 44842 1727204503.39239: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.39240: done checking to see if all hosts have failed 44842 1727204503.39241: getting the remaining hosts for this loop 44842 1727204503.39243: done getting the remaining hosts for this loop 44842 1727204503.39247: getting the next task for host managed-node1 44842 1727204503.39255: done getting next task for host managed-node1 44842 1727204503.39260: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44842 1727204503.39265: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204503.39282: done sending task result for task 0affcd87-79f5-aad0-d242-000000000019 44842 1727204503.39288: WORKER PROCESS EXITING 44842 1727204503.39296: getting variables 44842 1727204503.39298: in VariableManager get_vars() 44842 1727204503.39339: Calling all_inventory to load vars for managed-node1 44842 1727204503.39343: Calling groups_inventory to load vars for managed-node1 44842 1727204503.39345: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.39358: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.39361: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.39366: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.41020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.42948: done with get_vars() 44842 1727204503.42973: done getting variables 44842 1727204503.43033: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.054) 0:00:13.598 ***** 44842 1727204503.43070: entering _queue_task() for managed-node1/fail 44842 1727204503.43380: worker is 1 (out of 1 available) 44842 1727204503.43394: exiting _queue_task() for managed-node1/fail 44842 1727204503.43405: done queuing things up, now waiting for results queue to drain 44842 1727204503.43407: waiting for pending results... 
44842 1727204503.44081: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44842 1727204503.44086: in run() - task 0affcd87-79f5-aad0-d242-00000000001a 44842 1727204503.44090: variable 'ansible_search_path' from source: unknown 44842 1727204503.44100: variable 'ansible_search_path' from source: unknown 44842 1727204503.44135: calling self._execute() 44842 1727204503.44223: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.44227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.44240: variable 'omit' from source: magic vars 44842 1727204503.44714: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.44723: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.44844: variable 'network_state' from source: role '' defaults 44842 1727204503.44858: Evaluated conditional (network_state != {}): False 44842 1727204503.44865: when evaluation is False, skipping this task 44842 1727204503.44869: _execute() done 44842 1727204503.44871: dumping result to json 44842 1727204503.44874: done dumping result, returning 44842 1727204503.44879: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-aad0-d242-00000000001a] 44842 1727204503.44885: sending task result for task 0affcd87-79f5-aad0-d242-00000000001a 44842 1727204503.44975: done sending task result for task 0affcd87-79f5-aad0-d242-00000000001a 44842 1727204503.44979: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204503.45028: no more pending results, returning what we have 44842 
1727204503.45032: results queue empty 44842 1727204503.45033: checking for any_errors_fatal 44842 1727204503.45040: done checking for any_errors_fatal 44842 1727204503.45040: checking for max_fail_percentage 44842 1727204503.45042: done checking for max_fail_percentage 44842 1727204503.45043: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.45044: done checking to see if all hosts have failed 44842 1727204503.45045: getting the remaining hosts for this loop 44842 1727204503.45046: done getting the remaining hosts for this loop 44842 1727204503.45051: getting the next task for host managed-node1 44842 1727204503.45059: done getting next task for host managed-node1 44842 1727204503.45063: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44842 1727204503.45068: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204503.45086: getting variables 44842 1727204503.45088: in VariableManager get_vars() 44842 1727204503.45129: Calling all_inventory to load vars for managed-node1 44842 1727204503.45132: Calling groups_inventory to load vars for managed-node1 44842 1727204503.45134: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.45146: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.45148: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.45151: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.46734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.48512: done with get_vars() 44842 1727204503.48543: done getting variables 44842 1727204503.48603: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.055) 0:00:13.654 ***** 44842 1727204503.48635: entering _queue_task() for managed-node1/fail 44842 1727204503.48928: worker is 1 (out of 1 available) 44842 1727204503.48942: exiting _queue_task() for managed-node1/fail 44842 1727204503.48955: done queuing things up, now waiting for results queue to drain 44842 1727204503.48957: waiting for pending results... 
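The same pattern repeats for the next guard (task path `roles/network/tasks/main.yml:25`, again the `fail` action). The log confirms the task name and the condition `ansible_distribution_major_version | int > 9`; everything else in this sketch is assumed:

```yaml
# Hypothetical reconstruction -- only the task name, the `fail` action,
# and the printed conditionals are confirmed by the log output.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: "Teaming is not supported on EL10 or later"  # placeholder wording
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version | int > 9
```

On this managed node the major version evaluated to 9 or below, so `ansible_distribution_major_version | int > 9` was False and the task was skipped.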
44842 1727204503.49706: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44842 1727204503.50073: in run() - task 0affcd87-79f5-aad0-d242-00000000001b 44842 1727204503.50077: variable 'ansible_search_path' from source: unknown 44842 1727204503.50080: variable 'ansible_search_path' from source: unknown 44842 1727204503.50119: calling self._execute() 44842 1727204503.50321: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.50324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.50336: variable 'omit' from source: magic vars 44842 1727204503.51199: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.51212: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.51677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204503.56399: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204503.56582: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204503.56618: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204503.56652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204503.56684: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204503.56755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.56788: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.56814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.56853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.56868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.56963: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.56978: Evaluated conditional (ansible_distribution_major_version | int > 9): False 44842 1727204503.56982: when evaluation is False, skipping this task 44842 1727204503.56985: _execute() done 44842 1727204503.56993: dumping result to json 44842 1727204503.56996: done dumping result, returning 44842 1727204503.57005: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-aad0-d242-00000000001b] 44842 1727204503.57010: sending task result for task 0affcd87-79f5-aad0-d242-00000000001b 44842 1727204503.57112: done sending task result for task 0affcd87-79f5-aad0-d242-00000000001b 44842 1727204503.57116: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 44842 1727204503.57167: no more pending results, returning what we have 44842 1727204503.57172: 
results queue empty 44842 1727204503.57173: checking for any_errors_fatal 44842 1727204503.57180: done checking for any_errors_fatal 44842 1727204503.57181: checking for max_fail_percentage 44842 1727204503.57183: done checking for max_fail_percentage 44842 1727204503.57184: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.57185: done checking to see if all hosts have failed 44842 1727204503.57186: getting the remaining hosts for this loop 44842 1727204503.57188: done getting the remaining hosts for this loop 44842 1727204503.57192: getting the next task for host managed-node1 44842 1727204503.57199: done getting next task for host managed-node1 44842 1727204503.57203: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44842 1727204503.57207: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204503.57221: getting variables 44842 1727204503.57223: in VariableManager get_vars() 44842 1727204503.57270: Calling all_inventory to load vars for managed-node1 44842 1727204503.57274: Calling groups_inventory to load vars for managed-node1 44842 1727204503.57277: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.57288: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.57291: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.57294: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.60924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.64222: done with get_vars() 44842 1727204503.64262: done getting variables 44842 1727204503.64780: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.161) 0:00:13.816 ***** 44842 1727204503.64816: entering _queue_task() for managed-node1/dnf 44842 1727204503.65144: worker is 1 (out of 1 available) 44842 1727204503.65158: exiting _queue_task() for managed-node1/dnf 44842 1727204503.65172: done queuing things up, now waiting for results queue to drain 44842 1727204503.65174: waiting for pending results... 
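Here the role switches from `fail` guards to a package check (task path `roles/network/tasks/main.yml:36`, action plugin `dnf`). The log confirms the task name, the `dnf` action, and the gating condition `__network_wireless_connections_defined or __network_team_connections_defined`; the package list and `check_mode`/`state` details below are illustrative assumptions:

```yaml
# Hypothetical sketch. Confirmed by the log: task name, `dnf` action,
# and the when-condition on the two role-default booleans (which the
# trace shows being derived from `network_connections`). Package names
# and module arguments are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined
```

In this run neither a wireless nor a team interface was defined in `network_connections`, so the combined condition evaluated False and the check was skipped.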
44842 1727204503.65465: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44842 1727204503.65615: in run() - task 0affcd87-79f5-aad0-d242-00000000001c 44842 1727204503.65634: variable 'ansible_search_path' from source: unknown 44842 1727204503.65693: variable 'ansible_search_path' from source: unknown 44842 1727204503.65734: calling self._execute() 44842 1727204503.65842: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.66058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.66079: variable 'omit' from source: magic vars 44842 1727204503.66733: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.66747: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.66978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204503.69596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204503.69681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204503.69722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204503.69756: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204503.69791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204503.69913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.69947: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.69975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.70022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.70036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.70477: variable 'ansible_distribution' from source: facts 44842 1727204503.70481: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.70502: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44842 1727204503.70630: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204503.70766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.70788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.70814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.70863: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.70877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.70915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.70945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.70972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.71012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.71025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.71072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.71094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 
1727204503.71117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.71837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.71856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.72132: variable 'network_connections' from source: task vars 44842 1727204503.72263: variable 'interface' from source: set_fact 44842 1727204503.72332: variable 'interface' from source: set_fact 44842 1727204503.72339: variable 'interface' from source: set_fact 44842 1727204503.72411: variable 'interface' from source: set_fact 44842 1727204503.72515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204503.72698: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204503.72736: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204503.72784: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204503.72827: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204503.72867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204503.72890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204503.72925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.72951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204503.73007: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204503.73269: variable 'network_connections' from source: task vars 44842 1727204503.73275: variable 'interface' from source: set_fact 44842 1727204503.73335: variable 'interface' from source: set_fact 44842 1727204503.73347: variable 'interface' from source: set_fact 44842 1727204503.73409: variable 'interface' from source: set_fact 44842 1727204503.73473: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204503.73476: when evaluation is False, skipping this task 44842 1727204503.73479: _execute() done 44842 1727204503.73481: dumping result to json 44842 1727204503.73483: done dumping result, returning 44842 1727204503.73493: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-00000000001c] 44842 1727204503.73498: sending task result for task 0affcd87-79f5-aad0-d242-00000000001c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204503.73650: no more pending results, returning what we have 44842 1727204503.73654: results queue 
empty 44842 1727204503.73656: checking for any_errors_fatal 44842 1727204503.73666: done checking for any_errors_fatal 44842 1727204503.73667: checking for max_fail_percentage 44842 1727204503.73669: done checking for max_fail_percentage 44842 1727204503.73670: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.73670: done checking to see if all hosts have failed 44842 1727204503.73671: getting the remaining hosts for this loop 44842 1727204503.73673: done getting the remaining hosts for this loop 44842 1727204503.73679: getting the next task for host managed-node1 44842 1727204503.73687: done getting next task for host managed-node1 44842 1727204503.73692: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44842 1727204503.73695: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204503.73714: getting variables 44842 1727204503.73716: in VariableManager get_vars() 44842 1727204503.73757: Calling all_inventory to load vars for managed-node1 44842 1727204503.73761: Calling groups_inventory to load vars for managed-node1 44842 1727204503.73765: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.73777: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.73780: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.73784: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.74400: done sending task result for task 0affcd87-79f5-aad0-d242-00000000001c 44842 1727204503.74404: WORKER PROCESS EXITING 44842 1727204503.75511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.77290: done with get_vars() 44842 1727204503.77321: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44842 1727204503.77430: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.126) 0:00:13.942 ***** 44842 1727204503.77469: entering _queue_task() for managed-node1/yum 44842 1727204503.77471: Creating lock for yum 44842 1727204503.77798: worker is 1 (out of 1 available) 44842 1727204503.77810: exiting _queue_task() for managed-node1/yum 44842 
1727204503.77822: done queuing things up, now waiting for results queue to drain 44842 1727204503.77823: waiting for pending results... 44842 1727204503.78130: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44842 1727204503.78265: in run() - task 0affcd87-79f5-aad0-d242-00000000001d 44842 1727204503.78282: variable 'ansible_search_path' from source: unknown 44842 1727204503.78286: variable 'ansible_search_path' from source: unknown 44842 1727204503.78325: calling self._execute() 44842 1727204503.78413: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.78418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.78436: variable 'omit' from source: magic vars 44842 1727204503.79803: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.79815: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.80230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204503.83659: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204503.84076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204503.84119: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204503.84153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204503.84181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204503.84265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.84291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.84318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.84369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.84383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.84488: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.84504: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44842 1727204503.84507: when evaluation is False, skipping this task 44842 1727204503.84510: _execute() done 44842 1727204503.84513: dumping result to json 44842 1727204503.84515: done dumping result, returning 44842 1727204503.84523: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-00000000001d] 44842 1727204503.84529: sending task result for task 0affcd87-79f5-aad0-d242-00000000001d 44842 1727204503.84634: done sending task result for task 0affcd87-79f5-aad0-d242-00000000001d 44842 1727204503.84637: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44842 1727204503.84702: no more pending results, returning what we have 44842 1727204503.84706: results queue empty 44842 1727204503.84707: checking for any_errors_fatal 44842 1727204503.84713: done checking for any_errors_fatal 44842 1727204503.84714: checking for max_fail_percentage 44842 1727204503.84716: done checking for max_fail_percentage 44842 1727204503.84717: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.84718: done checking to see if all hosts have failed 44842 1727204503.84719: getting the remaining hosts for this loop 44842 1727204503.84721: done getting the remaining hosts for this loop 44842 1727204503.84726: getting the next task for host managed-node1 44842 1727204503.84733: done getting next task for host managed-node1 44842 1727204503.84738: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44842 1727204503.84741: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204503.84758: getting variables 44842 1727204503.84760: in VariableManager get_vars() 44842 1727204503.84803: Calling all_inventory to load vars for managed-node1 44842 1727204503.84806: Calling groups_inventory to load vars for managed-node1 44842 1727204503.84809: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.84820: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.84823: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.84826: Calling groups_plugins_play to load vars for managed-node1 44842 1727204503.86652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204503.88362: done with get_vars() 44842 1727204503.88393: done getting variables 44842 1727204503.88457: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.110) 0:00:14.053 ***** 44842 1727204503.88498: entering _queue_task() for managed-node1/fail 44842 1727204503.88826: worker is 1 (out of 1 available) 44842 1727204503.88840: exiting _queue_task() for managed-node1/fail 44842 1727204503.88854: done queuing things up, now waiting for results queue to drain 44842 1727204503.88855: waiting for pending results... 
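The final banner in this excerpt (task path `roles/network/tasks/main.yml:60`) again loads the `fail` action, suggesting the "ask user's consent" task is implemented as a conditional hard stop rather than an interactive prompt. The trace below begins evaluating `__network_wireless_connections_defined`, so the task plausibly looks like the following; the consent variable name and message are assumptions:

```yaml
# Hypothetical sketch. Confirmed by the log: task name, `fail` action,
# and that `__network_wireless_connections_defined` participates in the
# condition. The consent flag name and msg text are assumptions.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: "NetworkManager must be restarted; set the consent variable to proceed"  # placeholder
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined
```

The log cuts off mid-evaluation here, but given that the wireless/team condition was already False for the earlier DNF and YUM checks, this task would be expected to skip as well.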
44842 1727204503.89157: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44842 1727204503.89283: in run() - task 0affcd87-79f5-aad0-d242-00000000001e 44842 1727204503.89301: variable 'ansible_search_path' from source: unknown 44842 1727204503.89305: variable 'ansible_search_path' from source: unknown 44842 1727204503.89346: calling self._execute() 44842 1727204503.89434: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204503.89445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204503.89455: variable 'omit' from source: magic vars 44842 1727204503.89838: variable 'ansible_distribution_major_version' from source: facts 44842 1727204503.89856: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204503.89984: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204503.90185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204503.94157: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204503.94237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204503.94386: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204503.94420: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204503.94448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204503.94638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44842 1727204503.94669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.94699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.94853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.94868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.94913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.95058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.95085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.95124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.95252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.95294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204503.95315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204503.95337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.95488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204503.95501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204503.95897: variable 'network_connections' from source: task vars 44842 1727204503.95910: variable 'interface' from source: set_fact 44842 1727204503.95985: variable 'interface' from source: set_fact 44842 1727204503.95993: variable 'interface' from source: set_fact 44842 1727204503.96167: variable 'interface' from source: set_fact 44842 1727204503.96372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204503.96646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204503.96801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204503.96845: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204503.96988: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204503.97031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204503.97054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204503.97081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204503.97222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204503.97290: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204503.97888: variable 'network_connections' from source: task vars 44842 1727204503.97892: variable 'interface' from source: set_fact 44842 1727204503.97955: variable 'interface' from source: set_fact 44842 1727204503.98080: variable 'interface' from source: set_fact 44842 1727204503.98140: variable 'interface' from source: set_fact 44842 1727204503.98419: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204503.98422: when evaluation is False, skipping this task 44842 1727204503.98425: _execute() done 44842 1727204503.98427: dumping result to json 44842 1727204503.98429: done dumping result, returning 44842 1727204503.98439: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-00000000001e] 44842 1727204503.98451: sending task result for task 0affcd87-79f5-aad0-d242-00000000001e 44842 1727204503.98545: done sending task result for task 0affcd87-79f5-aad0-d242-00000000001e 44842 1727204503.98549: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204503.98609: no more pending results, returning what we have 44842 1727204503.98613: results queue empty 44842 1727204503.98614: checking for any_errors_fatal 44842 1727204503.98620: done checking for any_errors_fatal 44842 1727204503.98621: checking for max_fail_percentage 44842 1727204503.98623: done checking for max_fail_percentage 44842 1727204503.98624: checking to see if all hosts have failed and the running result is not ok 44842 1727204503.98625: done checking to see if all hosts have failed 44842 1727204503.98626: getting the remaining hosts for this loop 44842 1727204503.98628: done getting the remaining hosts for this loop 44842 1727204503.98633: getting the next task for host managed-node1 44842 1727204503.98640: done getting next task for host managed-node1 44842 1727204503.98645: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44842 1727204503.98648: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44842 1727204503.98665: getting variables 44842 1727204503.98667: in VariableManager get_vars() 44842 1727204503.98707: Calling all_inventory to load vars for managed-node1 44842 1727204503.98710: Calling groups_inventory to load vars for managed-node1 44842 1727204503.98713: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204503.98723: Calling all_plugins_play to load vars for managed-node1 44842 1727204503.98726: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204503.98729: Calling groups_plugins_play to load vars for managed-node1 44842 1727204504.01384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204504.06236: done with get_vars() 44842 1727204504.06274: done getting variables 44842 1727204504.06336: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.178) 0:00:14.231 ***** 44842 1727204504.06374: entering _queue_task() for managed-node1/package 44842 1727204504.07585: worker is 1 (out of 1 available) 44842 1727204504.07599: exiting _queue_task() for managed-node1/package 44842 1727204504.07611: done queuing things up, now waiting for results queue to drain 44842 1727204504.07613: waiting for pending results... 
44842 1727204504.08389: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 44842 1727204504.08523: in run() - task 0affcd87-79f5-aad0-d242-00000000001f 44842 1727204504.08543: variable 'ansible_search_path' from source: unknown 44842 1727204504.08551: variable 'ansible_search_path' from source: unknown 44842 1727204504.08596: calling self._execute() 44842 1727204504.08688: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.08699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.08713: variable 'omit' from source: magic vars 44842 1727204504.09072: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.09544: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204504.09751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204504.10680: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204504.10792: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204504.10827: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204504.10866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204504.10987: variable 'network_packages' from source: role '' defaults 44842 1727204504.11155: variable '__network_provider_setup' from source: role '' defaults 44842 1727204504.11178: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204504.11251: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204504.11271: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204504.11338: variable 
'__network_packages_default_nm' from source: role '' defaults 44842 1727204504.11530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204504.23029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204504.23109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204504.23149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204504.23190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204504.23218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204504.23295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.23328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.23358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.23407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.23427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 
1727204504.23476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.23503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.23532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.23578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.23595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.24467: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44842 1727204504.24791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.24820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.24850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.24897: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.24917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.25037: variable 'ansible_python' from source: facts 44842 1727204504.25069: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44842 1727204504.25157: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204504.25269: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204504.25410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.25439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.25474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.25515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.25535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.25608: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.25648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.25683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.25729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.25747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.25903: variable 'network_connections' from source: task vars 44842 1727204504.25914: variable 'interface' from source: set_fact 44842 1727204504.26027: variable 'interface' from source: set_fact 44842 1727204504.26043: variable 'interface' from source: set_fact 44842 1727204504.26153: variable 'interface' from source: set_fact 44842 1727204504.26254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204504.26295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204504.26331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.26388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204504.26429: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204504.26731: variable 'network_connections' from source: task vars 44842 1727204504.26741: variable 'interface' from source: set_fact 44842 1727204504.26867: variable 'interface' from source: set_fact 44842 1727204504.26882: variable 'interface' from source: set_fact 44842 1727204504.26993: variable 'interface' from source: set_fact 44842 1727204504.27080: variable '__network_packages_default_wireless' from source: role '' defaults 44842 1727204504.27166: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204504.27518: variable 'network_connections' from source: task vars 44842 1727204504.27528: variable 'interface' from source: set_fact 44842 1727204504.27601: variable 'interface' from source: set_fact 44842 1727204504.27614: variable 'interface' from source: set_fact 44842 1727204504.27693: variable 'interface' from source: set_fact 44842 1727204504.27745: variable '__network_packages_default_team' from source: role '' defaults 44842 1727204504.27845: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204504.28193: variable 'network_connections' from source: task vars 44842 1727204504.28203: variable 'interface' from source: set_fact 44842 1727204504.28285: variable 'interface' from source: set_fact 44842 1727204504.28302: variable 'interface' from source: set_fact 44842 1727204504.28372: variable 'interface' from source: set_fact 44842 1727204504.28456: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 
1727204504.28536: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204504.28548: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204504.28617: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204504.28845: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44842 1727204504.29373: variable 'network_connections' from source: task vars 44842 1727204504.29386: variable 'interface' from source: set_fact 44842 1727204504.29449: variable 'interface' from source: set_fact 44842 1727204504.29463: variable 'interface' from source: set_fact 44842 1727204504.29527: variable 'interface' from source: set_fact 44842 1727204504.29570: variable 'ansible_distribution' from source: facts 44842 1727204504.29581: variable '__network_rh_distros' from source: role '' defaults 44842 1727204504.29591: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.29619: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44842 1727204504.29813: variable 'ansible_distribution' from source: facts 44842 1727204504.29824: variable '__network_rh_distros' from source: role '' defaults 44842 1727204504.29834: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.29849: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44842 1727204504.30020: variable 'ansible_distribution' from source: facts 44842 1727204504.30030: variable '__network_rh_distros' from source: role '' defaults 44842 1727204504.30040: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.30087: variable 'network_provider' from source: set_fact 44842 1727204504.30107: variable 'ansible_facts' from source: unknown 44842 1727204504.30822: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 44842 1727204504.30830: when evaluation is False, skipping this task 44842 1727204504.30837: _execute() done 44842 1727204504.30845: dumping result to json 44842 1727204504.30852: done dumping result, returning 44842 1727204504.30870: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-aad0-d242-00000000001f] 44842 1727204504.30879: sending task result for task 0affcd87-79f5-aad0-d242-00000000001f skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44842 1727204504.32304: no more pending results, returning what we have 44842 1727204504.32307: results queue empty 44842 1727204504.32308: checking for any_errors_fatal 44842 1727204504.32314: done checking for any_errors_fatal 44842 1727204504.32314: checking for max_fail_percentage 44842 1727204504.32316: done checking for max_fail_percentage 44842 1727204504.32317: checking to see if all hosts have failed and the running result is not ok 44842 1727204504.32318: done checking to see if all hosts have failed 44842 1727204504.32319: getting the remaining hosts for this loop 44842 1727204504.32320: done getting the remaining hosts for this loop 44842 1727204504.32324: getting the next task for host managed-node1 44842 1727204504.32330: done getting next task for host managed-node1 44842 1727204504.32334: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44842 1727204504.32338: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204504.32353: getting variables 44842 1727204504.32354: in VariableManager get_vars() 44842 1727204504.32391: Calling all_inventory to load vars for managed-node1 44842 1727204504.32399: Calling groups_inventory to load vars for managed-node1 44842 1727204504.32401: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204504.32411: Calling all_plugins_play to load vars for managed-node1 44842 1727204504.32413: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204504.32416: Calling groups_plugins_play to load vars for managed-node1 44842 1727204504.33233: done sending task result for task 0affcd87-79f5-aad0-d242-00000000001f 44842 1727204504.33237: WORKER PROCESS EXITING 44842 1727204504.43074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204504.45596: done with get_vars() 44842 1727204504.45627: done getting variables 44842 1727204504.45696: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.393) 0:00:14.625 ***** 44842 1727204504.45729: 
entering _queue_task() for managed-node1/package 44842 1727204504.46096: worker is 1 (out of 1 available) 44842 1727204504.46108: exiting _queue_task() for managed-node1/package 44842 1727204504.46122: done queuing things up, now waiting for results queue to drain 44842 1727204504.46123: waiting for pending results... 44842 1727204504.46419: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44842 1727204504.46569: in run() - task 0affcd87-79f5-aad0-d242-000000000020 44842 1727204504.46586: variable 'ansible_search_path' from source: unknown 44842 1727204504.46590: variable 'ansible_search_path' from source: unknown 44842 1727204504.46628: calling self._execute() 44842 1727204504.46737: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.46746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.46757: variable 'omit' from source: magic vars 44842 1727204504.47169: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.47186: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204504.47313: variable 'network_state' from source: role '' defaults 44842 1727204504.47326: Evaluated conditional (network_state != {}): False 44842 1727204504.47329: when evaluation is False, skipping this task 44842 1727204504.47332: _execute() done 44842 1727204504.47335: dumping result to json 44842 1727204504.47338: done dumping result, returning 44842 1727204504.47345: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-aad0-d242-000000000020] 44842 1727204504.47352: sending task result for task 0affcd87-79f5-aad0-d242-000000000020 44842 1727204504.47452: done sending task result for task 0affcd87-79f5-aad0-d242-000000000020 44842 
1727204504.47456: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204504.47511: no more pending results, returning what we have 44842 1727204504.47516: results queue empty 44842 1727204504.47517: checking for any_errors_fatal 44842 1727204504.47525: done checking for any_errors_fatal 44842 1727204504.47526: checking for max_fail_percentage 44842 1727204504.47528: done checking for max_fail_percentage 44842 1727204504.47529: checking to see if all hosts have failed and the running result is not ok 44842 1727204504.47530: done checking to see if all hosts have failed 44842 1727204504.47531: getting the remaining hosts for this loop 44842 1727204504.47533: done getting the remaining hosts for this loop 44842 1727204504.47537: getting the next task for host managed-node1 44842 1727204504.47544: done getting next task for host managed-node1 44842 1727204504.47549: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44842 1727204504.47552: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204504.47572: getting variables 44842 1727204504.47574: in VariableManager get_vars() 44842 1727204504.47615: Calling all_inventory to load vars for managed-node1 44842 1727204504.47618: Calling groups_inventory to load vars for managed-node1 44842 1727204504.47621: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204504.47632: Calling all_plugins_play to load vars for managed-node1 44842 1727204504.47634: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204504.47637: Calling groups_plugins_play to load vars for managed-node1 44842 1727204504.49346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204504.51438: done with get_vars() 44842 1727204504.51481: done getting variables 44842 1727204504.51548: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.058) 0:00:14.683 ***** 44842 1727204504.51588: entering _queue_task() for managed-node1/package 44842 1727204504.51938: worker is 1 (out of 1 available) 44842 1727204504.51967: exiting _queue_task() for managed-node1/package 44842 1727204504.51981: done queuing things up, now waiting for results queue to drain 44842 1727204504.51982: waiting for pending results... 
44842 1727204504.52278: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44842 1727204504.52437: in run() - task 0affcd87-79f5-aad0-d242-000000000021 44842 1727204504.52451: variable 'ansible_search_path' from source: unknown 44842 1727204504.52456: variable 'ansible_search_path' from source: unknown 44842 1727204504.52499: calling self._execute() 44842 1727204504.52599: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.52603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.52613: variable 'omit' from source: magic vars 44842 1727204504.53039: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.53052: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204504.53214: variable 'network_state' from source: role '' defaults 44842 1727204504.53225: Evaluated conditional (network_state != {}): False 44842 1727204504.53229: when evaluation is False, skipping this task 44842 1727204504.53232: _execute() done 44842 1727204504.53234: dumping result to json 44842 1727204504.53237: done dumping result, returning 44842 1727204504.53249: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-aad0-d242-000000000021] 44842 1727204504.53256: sending task result for task 0affcd87-79f5-aad0-d242-000000000021 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204504.53407: no more pending results, returning what we have 44842 1727204504.53411: results queue empty 44842 1727204504.53412: checking for any_errors_fatal 44842 1727204504.53420: done checking for any_errors_fatal 44842 1727204504.53421: checking for max_fail_percentage 44842 
1727204504.53423: done checking for max_fail_percentage 44842 1727204504.53424: checking to see if all hosts have failed and the running result is not ok 44842 1727204504.53425: done checking to see if all hosts have failed 44842 1727204504.53425: getting the remaining hosts for this loop 44842 1727204504.53427: done getting the remaining hosts for this loop 44842 1727204504.53431: getting the next task for host managed-node1 44842 1727204504.53438: done getting next task for host managed-node1 44842 1727204504.53442: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44842 1727204504.53445: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204504.53467: done sending task result for task 0affcd87-79f5-aad0-d242-000000000021 44842 1727204504.53471: WORKER PROCESS EXITING 44842 1727204504.53485: getting variables 44842 1727204504.53487: in VariableManager get_vars() 44842 1727204504.53529: Calling all_inventory to load vars for managed-node1 44842 1727204504.53532: Calling groups_inventory to load vars for managed-node1 44842 1727204504.53534: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204504.53546: Calling all_plugins_play to load vars for managed-node1 44842 1727204504.53549: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204504.53552: Calling groups_plugins_play to load vars for managed-node1 44842 1727204504.55684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204504.58212: done with get_vars() 44842 1727204504.58241: done getting variables 44842 1727204504.58350: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.067) 0:00:14.751 ***** 44842 1727204504.58389: entering _queue_task() for managed-node1/service 44842 1727204504.58391: Creating lock for service 44842 1727204504.58808: worker is 1 (out of 1 available) 44842 1727204504.58819: exiting _queue_task() for managed-node1/service 44842 1727204504.58837: done queuing things up, now waiting for results queue to drain 44842 1727204504.58839: waiting for pending results... 
44842 1727204504.59389: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44842 1727204504.59520: in run() - task 0affcd87-79f5-aad0-d242-000000000022 44842 1727204504.59533: variable 'ansible_search_path' from source: unknown 44842 1727204504.59537: variable 'ansible_search_path' from source: unknown 44842 1727204504.59576: calling self._execute() 44842 1727204504.59681: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.59685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.59696: variable 'omit' from source: magic vars 44842 1727204504.60093: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.60105: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204504.60234: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204504.60447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204504.63734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204504.64482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204504.64521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204504.64559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204504.64596: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204504.64684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 44842 1727204504.64719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.64745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.64800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.64815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.64861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.64897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.64927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.64972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.64992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.65037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.65071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.65092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.65136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.65149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.65345: variable 'network_connections' from source: task vars 44842 1727204504.65358: variable 'interface' from source: set_fact 44842 1727204504.65479: variable 'interface' from source: set_fact 44842 1727204504.65488: variable 'interface' from source: set_fact 44842 1727204504.65736: variable 'interface' from source: set_fact 44842 1727204504.65740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204504.65930: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204504.65971: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204504.66082: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204504.66131: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204504.66182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204504.66205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204504.66235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.66261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204504.66332: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204504.66596: variable 'network_connections' from source: task vars 44842 1727204504.66600: variable 'interface' from source: set_fact 44842 1727204504.66673: variable 'interface' from source: set_fact 44842 1727204504.66679: variable 'interface' from source: set_fact 44842 1727204504.66743: variable 'interface' from source: set_fact 44842 1727204504.66804: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204504.66807: when evaluation is False, skipping this task 44842 1727204504.66809: _execute() done 44842 1727204504.66812: dumping result to json 44842 1727204504.66814: done dumping result, returning 44842 1727204504.66823: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000022] 44842 1727204504.66838: sending task result for task 0affcd87-79f5-aad0-d242-000000000022 44842 1727204504.66934: done sending task result for task 0affcd87-79f5-aad0-d242-000000000022 44842 1727204504.66937: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204504.66996: no more pending results, returning what we have 44842 1727204504.67000: results queue empty 44842 1727204504.67001: checking for any_errors_fatal 44842 1727204504.67009: done checking for any_errors_fatal 44842 1727204504.67010: checking for max_fail_percentage 44842 1727204504.67012: done checking for max_fail_percentage 44842 1727204504.67013: checking to see if all hosts have failed and the running result is not ok 44842 1727204504.67014: done checking to see if all hosts have failed 44842 1727204504.67015: getting the remaining hosts for this loop 44842 1727204504.67017: done getting the remaining hosts for this loop 44842 1727204504.67021: getting the next task for host managed-node1 44842 1727204504.67029: done getting next task for host managed-node1 44842 1727204504.67033: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44842 1727204504.67036: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44842 1727204504.67051: getting variables 44842 1727204504.67053: in VariableManager get_vars() 44842 1727204504.67101: Calling all_inventory to load vars for managed-node1 44842 1727204504.67105: Calling groups_inventory to load vars for managed-node1 44842 1727204504.67108: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204504.67118: Calling all_plugins_play to load vars for managed-node1 44842 1727204504.67121: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204504.67124: Calling groups_plugins_play to load vars for managed-node1 44842 1727204504.69303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204504.71601: done with get_vars() 44842 1727204504.71629: done getting variables 44842 1727204504.71741: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.133) 0:00:14.885 ***** 44842 1727204504.71784: entering _queue_task() for managed-node1/service 44842 1727204504.72314: worker is 1 (out of 1 available) 44842 1727204504.72325: exiting _queue_task() for managed-node1/service 44842 1727204504.72335: done queuing things up, now waiting for results queue to drain 44842 1727204504.72336: waiting for pending results... 
44842 1727204504.72740: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44842 1727204504.72941: in run() - task 0affcd87-79f5-aad0-d242-000000000023 44842 1727204504.72953: variable 'ansible_search_path' from source: unknown 44842 1727204504.72957: variable 'ansible_search_path' from source: unknown 44842 1727204504.73143: calling self._execute() 44842 1727204504.73248: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.73252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.73267: variable 'omit' from source: magic vars 44842 1727204504.73827: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.73840: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204504.74024: variable 'network_provider' from source: set_fact 44842 1727204504.74028: variable 'network_state' from source: role '' defaults 44842 1727204504.74042: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44842 1727204504.74048: variable 'omit' from source: magic vars 44842 1727204504.74112: variable 'omit' from source: magic vars 44842 1727204504.74139: variable 'network_service_name' from source: role '' defaults 44842 1727204504.74221: variable 'network_service_name' from source: role '' defaults 44842 1727204504.74368: variable '__network_provider_setup' from source: role '' defaults 44842 1727204504.74375: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204504.74445: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204504.74453: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204504.74532: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204504.74814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 44842 1727204504.78341: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204504.78429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204504.78469: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204504.78510: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204504.78539: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204504.79292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.79325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.79349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.79402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.79418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.79465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44842 1727204504.79492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.79699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.79745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.79759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.80247: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44842 1727204504.80380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.80410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.80441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.80491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.80511: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.80614: variable 'ansible_python' from source: facts 44842 1727204504.80656: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44842 1727204504.80879: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204504.81108: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204504.81477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.81508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.81540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.81592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.81711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.81766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204504.81869: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204504.81933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.82237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204504.82258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204504.82474: variable 'network_connections' from source: task vars 44842 1727204504.82493: variable 'interface' from source: set_fact 44842 1727204504.82600: variable 'interface' from source: set_fact 44842 1727204504.82620: variable 'interface' from source: set_fact 44842 1727204504.82710: variable 'interface' from source: set_fact 44842 1727204504.82971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204504.83188: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204504.83250: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204504.83300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204504.83352: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204504.83422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204504.83461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204504.83502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204504.83549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204504.83610: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204504.83804: variable 'network_connections' from source: task vars 44842 1727204504.83810: variable 'interface' from source: set_fact 44842 1727204504.83866: variable 'interface' from source: set_fact 44842 1727204504.83876: variable 'interface' from source: set_fact 44842 1727204504.83928: variable 'interface' from source: set_fact 44842 1727204504.84053: variable '__network_packages_default_wireless' from source: role '' defaults 44842 1727204504.84112: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204504.84433: variable 'network_connections' from source: task vars 44842 1727204504.84446: variable 'interface' from source: set_fact 44842 1727204504.84528: variable 'interface' from source: set_fact 44842 1727204504.84551: variable 'interface' from source: set_fact 44842 1727204504.84627: variable 'interface' from source: set_fact 44842 1727204504.84698: variable '__network_packages_default_team' from source: role '' defaults 44842 1727204504.84795: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204504.85131: variable 
'network_connections' from source: task vars 44842 1727204504.85141: variable 'interface' from source: set_fact 44842 1727204504.85267: variable 'interface' from source: set_fact 44842 1727204504.85285: variable 'interface' from source: set_fact 44842 1727204504.85429: variable 'interface' from source: set_fact 44842 1727204504.85691: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204504.85770: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204504.85781: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204504.85858: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204504.86122: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44842 1727204504.86713: variable 'network_connections' from source: task vars 44842 1727204504.86726: variable 'interface' from source: set_fact 44842 1727204504.86801: variable 'interface' from source: set_fact 44842 1727204504.86813: variable 'interface' from source: set_fact 44842 1727204504.86882: variable 'interface' from source: set_fact 44842 1727204504.86993: variable 'ansible_distribution' from source: facts 44842 1727204504.87002: variable '__network_rh_distros' from source: role '' defaults 44842 1727204504.87011: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.87046: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44842 1727204504.87328: variable 'ansible_distribution' from source: facts 44842 1727204504.87336: variable '__network_rh_distros' from source: role '' defaults 44842 1727204504.87344: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.87359: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44842 1727204504.87548: variable 'ansible_distribution' from source: 
facts 44842 1727204504.87559: variable '__network_rh_distros' from source: role '' defaults 44842 1727204504.87574: variable 'ansible_distribution_major_version' from source: facts 44842 1727204504.87622: variable 'network_provider' from source: set_fact 44842 1727204504.87651: variable 'omit' from source: magic vars 44842 1727204504.87688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204504.87729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204504.87755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204504.87784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204504.87804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204504.87845: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204504.87853: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.87862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.87970: Set connection var ansible_shell_type to sh 44842 1727204504.87985: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204504.87994: Set connection var ansible_connection to ssh 44842 1727204504.88003: Set connection var ansible_pipelining to False 44842 1727204504.88015: Set connection var ansible_timeout to 10 44842 1727204504.88026: Set connection var ansible_shell_executable to /bin/sh 44842 1727204504.88062: variable 'ansible_shell_executable' from source: unknown 44842 1727204504.88072: variable 'ansible_connection' from source: unknown 44842 1727204504.88079: variable 'ansible_module_compression' from source: unknown 44842 1727204504.88085: 
variable 'ansible_shell_type' from source: unknown 44842 1727204504.88091: variable 'ansible_shell_executable' from source: unknown 44842 1727204504.88097: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204504.88108: variable 'ansible_pipelining' from source: unknown 44842 1727204504.88114: variable 'ansible_timeout' from source: unknown 44842 1727204504.88123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204504.88239: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204504.88267: variable 'omit' from source: magic vars 44842 1727204504.88279: starting attempt loop 44842 1727204504.88286: running the handler 44842 1727204504.88373: variable 'ansible_facts' from source: unknown 44842 1727204504.88998: _low_level_execute_command(): starting 44842 1727204504.89005: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204504.89665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204504.89680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204504.89698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204504.89715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204504.89753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204504.89761: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204504.89777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204504.89790: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204504.89802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204504.89810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204504.89817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204504.89826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204504.89838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204504.89845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204504.89852: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204504.89861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204504.89940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204504.89959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204504.89977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204504.90063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204504.91720: stdout chunk (state=3): >>>/root <<< 44842 1727204504.91894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204504.91898: stderr chunk (state=3): >>><<< 44842 1727204504.91901: stdout chunk (state=3): >>><<< 44842 1727204504.91921: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204504.91931: _low_level_execute_command(): starting 44842 1727204504.91937: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865 `" && echo ansible-tmp-1727204504.9191995-46119-74824782731865="` echo /root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865 `" ) && sleep 0' 44842 1727204504.92551: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204504.92555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204504.92569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204504.92583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204504.92620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204504.92626: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204504.92636: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204504.92648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204504.92659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204504.92662: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204504.92675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204504.92684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204504.92695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204504.92702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204504.92708: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204504.92717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204504.92796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204504.92810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204504.92820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204504.92906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204504.94766: stdout chunk (state=3): >>>ansible-tmp-1727204504.9191995-46119-74824782731865=/root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865 <<< 44842 1727204504.94885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204504.94971: stderr chunk (state=3): >>><<< 44842 1727204504.94988: stdout chunk (state=3): >>><<< 44842 1727204504.95175: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204504.9191995-46119-74824782731865=/root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204504.95184: variable 'ansible_module_compression' from source: unknown 44842 1727204504.95188: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 44842 1727204504.95191: ANSIBALLZ: Acquiring lock 44842 1727204504.95193: ANSIBALLZ: Lock acquired: 140164881036544 44842 1727204504.95195: ANSIBALLZ: Creating module 44842 1727204505.29767: ANSIBALLZ: Writing module into payload 44842 1727204505.30136: ANSIBALLZ: Writing module 44842 1727204505.30178: ANSIBALLZ: Renaming module 44842 1727204505.30200: ANSIBALLZ: Done creating module 44842 1727204505.30243: variable 'ansible_facts' from source: unknown 44842 1727204505.30477: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/AnsiballZ_systemd.py 44842 1727204505.30663: Sending initial data 44842 1727204505.30668: Sent initial data (155 bytes) 44842 1727204505.31656: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204505.31674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.31691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.31707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.31753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204505.31769: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204505.31783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.31798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204505.31807: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204505.31816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204505.31829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.31840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.31852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.31862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204505.31875: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204505.31886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 
1727204505.31963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204505.31988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204505.32002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204505.32086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204505.33830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204505.33893: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204505.33950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp0ysmn788 /root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/AnsiballZ_systemd.py <<< 44842 1727204505.33999: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204505.36667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204505.36871: stderr chunk (state=3): >>><<< 44842 1727204505.36874: stdout chunk (state=3): >>><<< 44842 1727204505.36876: done transferring module to remote 44842 1727204505.36955: _low_level_execute_command(): starting 44842 1727204505.36958: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/ /root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/AnsiballZ_systemd.py && sleep 0' 44842 1727204505.37515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204505.37529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.37542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.37558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.37603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204505.37615: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204505.37628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.37646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204505.37659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204505.37672: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204505.37684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.37698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.37714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.37725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204505.37734: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204505.37746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.37821: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204505.37837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204505.37851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204505.37942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204505.39643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204505.39713: stderr chunk (state=3): >>><<< 44842 1727204505.39716: stdout chunk (state=3): >>><<< 44842 1727204505.39809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204505.39815: _low_level_execute_command(): starting 44842 1727204505.39818: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/AnsiballZ_systemd.py && sleep 0' 44842 1727204505.40668: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204505.40688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.40712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.40731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.40773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204505.40785: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204505.40809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.40832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204505.40844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204505.40854: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204505.40868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.40885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.40905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.40924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204505.40939: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204505.40954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.41040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 
1727204505.41061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204505.41079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204505.41244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204505.66202: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 44842 1727204505.66235: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "14221312", "MemoryAvailable": "infinity", "CPUUsageNSec": "1610263000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", 
"RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogS<<< 44842 1727204505.66243: stdout chunk (state=3): >>>ignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", 
"CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44842 1727204505.67689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204505.67719: stderr chunk (state=3): >>><<< 44842 1727204505.67722: stdout chunk (state=3): >>><<< 44842 1727204505.67873: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14221312", "MemoryAvailable": "infinity", "CPUUsageNSec": "1610263000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204505.67972: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204505.67976: _low_level_execute_command(): starting 44842 1727204505.67978: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204504.9191995-46119-74824782731865/ > /dev/null 2>&1 && sleep 0' 44842 1727204505.69227: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.69242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.69275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204505.69281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204505.69287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 
1727204505.69294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204505.69301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.69310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.69315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.69384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204505.69387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204505.69457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204505.71205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204505.71301: stderr chunk (state=3): >>><<< 44842 1727204505.71313: stdout chunk (state=3): >>><<< 44842 1727204505.71474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204505.71477: handler run complete 44842 1727204505.71480: attempt loop complete, returning result 44842 1727204505.71482: _execute() done 44842 1727204505.71484: dumping result to json 44842 1727204505.71486: done dumping result, returning 44842 1727204505.71488: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-aad0-d242-000000000023] 44842 1727204505.71490: sending task result for task 0affcd87-79f5-aad0-d242-000000000023 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204505.71833: no more pending results, returning what we have 44842 1727204505.71837: results queue empty 44842 1727204505.71838: checking for any_errors_fatal 44842 1727204505.71844: done checking for any_errors_fatal 44842 1727204505.71845: checking for max_fail_percentage 44842 1727204505.71847: done checking for max_fail_percentage 44842 1727204505.71848: checking to see if all hosts have failed and the running result is not ok 44842 1727204505.71849: done checking to see if all hosts have failed 44842 1727204505.71849: getting the remaining hosts for this loop 44842 1727204505.71851: done getting the remaining hosts for this loop 44842 1727204505.71855: getting the next task for host managed-node1 44842 1727204505.71863: done getting next task for host managed-node1 44842 1727204505.71871: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44842 1727204505.71876: ^ state is: HOST STATE: block=2, task=9, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204505.71891: getting variables 44842 1727204505.71893: in VariableManager get_vars() 44842 1727204505.71934: Calling all_inventory to load vars for managed-node1 44842 1727204505.71937: Calling groups_inventory to load vars for managed-node1 44842 1727204505.71940: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204505.71950: Calling all_plugins_play to load vars for managed-node1 44842 1727204505.71953: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204505.71959: Calling groups_plugins_play to load vars for managed-node1 44842 1727204505.72786: done sending task result for task 0affcd87-79f5-aad0-d242-000000000023 44842 1727204505.72789: WORKER PROCESS EXITING 44842 1727204505.73707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204505.75023: done with get_vars() 44842 1727204505.75059: done getting variables 44842 1727204505.75122: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:45 -0400 (0:00:01.033) 0:00:15.919 ***** 44842 1727204505.75168: entering _queue_task() for managed-node1/service 44842 1727204505.75533: worker is 1 (out of 1 available) 44842 1727204505.75546: exiting _queue_task() for managed-node1/service 44842 1727204505.75563: done queuing things up, now waiting for results queue to drain 44842 1727204505.75568: waiting for pending results... 44842 1727204505.75874: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44842 1727204505.76018: in run() - task 0affcd87-79f5-aad0-d242-000000000024 44842 1727204505.76034: variable 'ansible_search_path' from source: unknown 44842 1727204505.76039: variable 'ansible_search_path' from source: unknown 44842 1727204505.76075: calling self._execute() 44842 1727204505.76173: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204505.76177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204505.76185: variable 'omit' from source: magic vars 44842 1727204505.76716: variable 'ansible_distribution_major_version' from source: facts 44842 1727204505.76741: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204505.76871: variable 'network_provider' from source: set_fact 44842 1727204505.76884: Evaluated conditional (network_provider == "nm"): True 44842 1727204505.76989: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204505.77065: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204505.77190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204505.78903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 
1727204505.78979: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204505.79019: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204505.79062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204505.79099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204505.79192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204505.79224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204505.79254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204505.79305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204505.79328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204505.79378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204505.79412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204505.79443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204505.79487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204505.79506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204505.79556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204505.79585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204505.79612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204505.79660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204505.79684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204505.79830: variable 'network_connections' from source: task vars 
44842 1727204505.79851: variable 'interface' from source: set_fact 44842 1727204505.79919: variable 'interface' from source: set_fact 44842 1727204505.79929: variable 'interface' from source: set_fact 44842 1727204505.80008: variable 'interface' from source: set_fact 44842 1727204505.80120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204505.80347: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204505.80355: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204505.80392: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204505.80421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204505.80477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204505.80500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204505.80526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204505.80563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204505.80615: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204505.80895: variable 'network_connections' from source: task vars 44842 1727204505.80898: variable 'interface' from 
source: set_fact 44842 1727204505.80959: variable 'interface' from source: set_fact 44842 1727204505.80970: variable 'interface' from source: set_fact 44842 1727204505.81068: variable 'interface' from source: set_fact 44842 1727204505.81216: Evaluated conditional (__network_wpa_supplicant_required): False 44842 1727204505.81219: when evaluation is False, skipping this task 44842 1727204505.81221: _execute() done 44842 1727204505.81233: dumping result to json 44842 1727204505.81235: done dumping result, returning 44842 1727204505.81237: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-aad0-d242-000000000024] 44842 1727204505.81241: sending task result for task 0affcd87-79f5-aad0-d242-000000000024 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44842 1727204505.81392: no more pending results, returning what we have 44842 1727204505.81397: results queue empty 44842 1727204505.81399: checking for any_errors_fatal 44842 1727204505.81426: done checking for any_errors_fatal 44842 1727204505.81427: checking for max_fail_percentage 44842 1727204505.81429: done checking for max_fail_percentage 44842 1727204505.81430: checking to see if all hosts have failed and the running result is not ok 44842 1727204505.81431: done checking to see if all hosts have failed 44842 1727204505.81432: getting the remaining hosts for this loop 44842 1727204505.81433: done getting the remaining hosts for this loop 44842 1727204505.81471: getting the next task for host managed-node1 44842 1727204505.81478: done getting next task for host managed-node1 44842 1727204505.81482: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44842 1727204505.81484: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204505.81503: getting variables 44842 1727204505.81505: in VariableManager get_vars() 44842 1727204505.81544: Calling all_inventory to load vars for managed-node1 44842 1727204505.81547: Calling groups_inventory to load vars for managed-node1 44842 1727204505.81549: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204505.81580: Calling all_plugins_play to load vars for managed-node1 44842 1727204505.81583: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204505.81588: Calling groups_plugins_play to load vars for managed-node1 44842 1727204505.82107: done sending task result for task 0affcd87-79f5-aad0-d242-000000000024 44842 1727204505.82111: WORKER PROCESS EXITING 44842 1727204505.82440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204505.83635: done with get_vars() 44842 1727204505.83666: done getting variables 44842 1727204505.83725: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.085) 0:00:16.005 
***** 44842 1727204505.83757: entering _queue_task() for managed-node1/service 44842 1727204505.84165: worker is 1 (out of 1 available) 44842 1727204505.84178: exiting _queue_task() for managed-node1/service 44842 1727204505.84199: done queuing things up, now waiting for results queue to drain 44842 1727204505.84201: waiting for pending results... 44842 1727204505.84485: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 44842 1727204505.84572: in run() - task 0affcd87-79f5-aad0-d242-000000000025 44842 1727204505.84583: variable 'ansible_search_path' from source: unknown 44842 1727204505.84587: variable 'ansible_search_path' from source: unknown 44842 1727204505.84616: calling self._execute() 44842 1727204505.84689: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204505.84692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204505.84701: variable 'omit' from source: magic vars 44842 1727204505.84973: variable 'ansible_distribution_major_version' from source: facts 44842 1727204505.84986: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204505.85068: variable 'network_provider' from source: set_fact 44842 1727204505.85072: Evaluated conditional (network_provider == "initscripts"): False 44842 1727204505.85074: when evaluation is False, skipping this task 44842 1727204505.85077: _execute() done 44842 1727204505.85079: dumping result to json 44842 1727204505.85081: done dumping result, returning 44842 1727204505.85089: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-aad0-d242-000000000025] 44842 1727204505.85096: sending task result for task 0affcd87-79f5-aad0-d242-000000000025 44842 1727204505.85183: done sending task result for task 0affcd87-79f5-aad0-d242-000000000025 44842 1727204505.85186: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204505.85235: no more pending results, returning what we have 44842 1727204505.85239: results queue empty 44842 1727204505.85240: checking for any_errors_fatal 44842 1727204505.85248: done checking for any_errors_fatal 44842 1727204505.85249: checking for max_fail_percentage 44842 1727204505.85250: done checking for max_fail_percentage 44842 1727204505.85251: checking to see if all hosts have failed and the running result is not ok 44842 1727204505.85252: done checking to see if all hosts have failed 44842 1727204505.85253: getting the remaining hosts for this loop 44842 1727204505.85254: done getting the remaining hosts for this loop 44842 1727204505.85258: getting the next task for host managed-node1 44842 1727204505.85269: done getting next task for host managed-node1 44842 1727204505.85272: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44842 1727204505.85275: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204505.85291: getting variables 44842 1727204505.85292: in VariableManager get_vars() 44842 1727204505.85326: Calling all_inventory to load vars for managed-node1 44842 1727204505.85329: Calling groups_inventory to load vars for managed-node1 44842 1727204505.85331: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204505.85339: Calling all_plugins_play to load vars for managed-node1 44842 1727204505.85342: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204505.85344: Calling groups_plugins_play to load vars for managed-node1 44842 1727204505.86272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204505.87197: done with get_vars() 44842 1727204505.87213: done getting variables 44842 1727204505.87257: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.035) 0:00:16.040 ***** 44842 1727204505.87283: entering _queue_task() for managed-node1/copy 44842 1727204505.87498: worker is 1 (out of 1 available) 44842 1727204505.87512: exiting _queue_task() for managed-node1/copy 44842 1727204505.87524: done queuing things up, now waiting for results queue to drain 44842 1727204505.87525: waiting for pending results... 
44842 1727204505.87697: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44842 1727204505.87785: in run() - task 0affcd87-79f5-aad0-d242-000000000026 44842 1727204505.87797: variable 'ansible_search_path' from source: unknown 44842 1727204505.87801: variable 'ansible_search_path' from source: unknown 44842 1727204505.87827: calling self._execute() 44842 1727204505.87899: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204505.87903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204505.87911: variable 'omit' from source: magic vars 44842 1727204505.88179: variable 'ansible_distribution_major_version' from source: facts 44842 1727204505.88190: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204505.88271: variable 'network_provider' from source: set_fact 44842 1727204505.88275: Evaluated conditional (network_provider == "initscripts"): False 44842 1727204505.88277: when evaluation is False, skipping this task 44842 1727204505.88282: _execute() done 44842 1727204505.88284: dumping result to json 44842 1727204505.88287: done dumping result, returning 44842 1727204505.88294: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-aad0-d242-000000000026] 44842 1727204505.88303: sending task result for task 0affcd87-79f5-aad0-d242-000000000026 44842 1727204505.88393: done sending task result for task 0affcd87-79f5-aad0-d242-000000000026 44842 1727204505.88395: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44842 1727204505.88455: no more pending results, returning what we have 44842 1727204505.88458: results queue empty 44842 1727204505.88462: checking for 
any_errors_fatal 44842 1727204505.88474: done checking for any_errors_fatal 44842 1727204505.88475: checking for max_fail_percentage 44842 1727204505.88477: done checking for max_fail_percentage 44842 1727204505.88477: checking to see if all hosts have failed and the running result is not ok 44842 1727204505.88478: done checking to see if all hosts have failed 44842 1727204505.88479: getting the remaining hosts for this loop 44842 1727204505.88481: done getting the remaining hosts for this loop 44842 1727204505.88484: getting the next task for host managed-node1 44842 1727204505.88489: done getting next task for host managed-node1 44842 1727204505.88493: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44842 1727204505.88495: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204505.88508: getting variables 44842 1727204505.88510: in VariableManager get_vars() 44842 1727204505.88547: Calling all_inventory to load vars for managed-node1 44842 1727204505.88549: Calling groups_inventory to load vars for managed-node1 44842 1727204505.88551: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204505.88557: Calling all_plugins_play to load vars for managed-node1 44842 1727204505.88559: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204505.88565: Calling groups_plugins_play to load vars for managed-node1 44842 1727204505.89345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204505.90282: done with get_vars() 44842 1727204505.90297: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.030) 0:00:16.071 ***** 44842 1727204505.90354: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44842 1727204505.90355: Creating lock for fedora.linux_system_roles.network_connections 44842 1727204505.90575: worker is 1 (out of 1 available) 44842 1727204505.90588: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44842 1727204505.90599: done queuing things up, now waiting for results queue to drain 44842 1727204505.90600: waiting for pending results... 
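
For reference, the module_args echoed later in this log show the connection profile this task applies. A hedged reconstruction of the role input (abbreviated, inferred from the logged `module_args`; not the verbatim test playbook):

```yaml
# Sketch of the network_connections variable reconstructed from the
# module_args in the log below; routes/rules abbreviated for brevity.
network_connections:
  - name: ethtest0
    interface_name: ethtest0
    type: ethernet
    state: up
    autoconnect: true
    ip:
      dhcp4: false
      address:
        - 198.51.100.3/26
        - 2001:db8::2/32
      route:
        - network: 198.51.100.64
          prefix: 26
          gateway: 198.51.100.6
          metric: 4
          table: 30200
      routing_rule:
        - priority: 30200
          from: 198.51.100.58/26
          table: 30200
        # ... remaining routes and rules exactly as shown in the
        # logged module_args (tables 30200/30400/30600, plus the
        # priority-200 rule targeting the named table "custom") ...
```
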
44842 1727204505.90778: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44842 1727204505.90866: in run() - task 0affcd87-79f5-aad0-d242-000000000027 44842 1727204505.90876: variable 'ansible_search_path' from source: unknown 44842 1727204505.90879: variable 'ansible_search_path' from source: unknown 44842 1727204505.90906: calling self._execute() 44842 1727204505.90976: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204505.90980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204505.90988: variable 'omit' from source: magic vars 44842 1727204505.91258: variable 'ansible_distribution_major_version' from source: facts 44842 1727204505.91270: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204505.91275: variable 'omit' from source: magic vars 44842 1727204505.91312: variable 'omit' from source: magic vars 44842 1727204505.91423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204505.93169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204505.93213: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204505.93242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204505.93268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204505.93288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204505.93345: variable 'network_provider' from source: set_fact 44842 1727204505.93439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204505.93459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204505.93479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204505.93506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204505.93517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204505.93572: variable 'omit' from source: magic vars 44842 1727204505.93650: variable 'omit' from source: magic vars 44842 1727204505.93722: variable 'network_connections' from source: task vars 44842 1727204505.93732: variable 'interface' from source: set_fact 44842 1727204505.93784: variable 'interface' from source: set_fact 44842 1727204505.93791: variable 'interface' from source: set_fact 44842 1727204505.93832: variable 'interface' from source: set_fact 44842 1727204505.94063: variable 'omit' from source: magic vars 44842 1727204505.94067: variable '__lsr_ansible_managed' from source: task vars 44842 1727204505.94113: variable '__lsr_ansible_managed' from source: task vars 44842 1727204505.94283: Loaded config def from plugin (lookup/template) 44842 1727204505.94289: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44842 1727204505.94311: File lookup term: get_ansible_managed.j2 44842 
1727204505.94315: variable 'ansible_search_path' from source: unknown 44842 1727204505.94318: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44842 1727204505.94329: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44842 1727204505.94341: variable 'ansible_search_path' from source: unknown 44842 1727204505.97675: variable 'ansible_managed' from source: unknown 44842 1727204505.97758: variable 'omit' from source: magic vars 44842 1727204505.97781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204505.97802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204505.97816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204505.97828: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204505.97836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204505.97857: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204505.97863: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204505.97867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204505.97925: Set connection var ansible_shell_type to sh 44842 1727204505.97933: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204505.97938: Set connection var ansible_connection to ssh 44842 1727204505.97943: Set connection var ansible_pipelining to False 44842 1727204505.97948: Set connection var ansible_timeout to 10 44842 1727204505.97954: Set connection var ansible_shell_executable to /bin/sh 44842 1727204505.97973: variable 'ansible_shell_executable' from source: unknown 44842 1727204505.97976: variable 'ansible_connection' from source: unknown 44842 1727204505.97979: variable 'ansible_module_compression' from source: unknown 44842 1727204505.97981: variable 'ansible_shell_type' from source: unknown 44842 1727204505.97985: variable 'ansible_shell_executable' from source: unknown 44842 1727204505.97988: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204505.97990: variable 'ansible_pipelining' from source: unknown 44842 1727204505.97992: variable 'ansible_timeout' from source: unknown 44842 1727204505.97994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204505.98085: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204505.98097: variable 'omit' from source: magic vars 44842 1727204505.98109: starting attempt loop 44842 1727204505.98113: running the handler 44842 1727204505.98120: _low_level_execute_command(): starting 44842 1727204505.98126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204505.98635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204505.98662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.98681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204505.98692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204505.98735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204505.98740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204505.98751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204505.98824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204506.00366: stdout chunk (state=3): >>>/root <<< 44842 1727204506.00473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204506.00531: stderr chunk (state=3): >>><<< 44842 1727204506.00534: stdout chunk (state=3): >>><<< 44842 1727204506.00552: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204506.00565: _low_level_execute_command(): starting 44842 1727204506.00572: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919 `" && echo ansible-tmp-1727204506.0055306-46182-265682951686919="` echo /root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919 `" ) && sleep 0' 44842 1727204506.01024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.01039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.01061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204506.01076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.01086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.01127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204506.01144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204506.01214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204506.03068: stdout chunk (state=3): >>>ansible-tmp-1727204506.0055306-46182-265682951686919=/root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919 <<< 44842 1727204506.03221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204506.03301: stderr chunk (state=3): >>><<< 44842 1727204506.03389: stdout chunk (state=3): >>><<< 44842 1727204506.03617: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204506.0055306-46182-265682951686919=/root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204506.03690: variable 'ansible_module_compression' from source: unknown 44842 1727204506.03731: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 44842 1727204506.03734: ANSIBALLZ: Acquiring lock 44842 1727204506.03737: ANSIBALLZ: Lock acquired: 140164880316416 44842 1727204506.03739: ANSIBALLZ: Creating module 44842 1727204506.30894: ANSIBALLZ: Writing module into payload 44842 1727204506.31424: ANSIBALLZ: Writing module 44842 1727204506.31454: ANSIBALLZ: Renaming module 44842 1727204506.31458: ANSIBALLZ: Done creating module 44842 1727204506.31489: variable 'ansible_facts' from source: unknown 44842 1727204506.31597: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/AnsiballZ_network_connections.py 44842 1727204506.31752: Sending initial data 44842 1727204506.31755: Sent initial data (168 bytes) 44842 1727204506.32986: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204506.32996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204506.33007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.33020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.33081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204506.33086: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204506.33097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.33110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204506.33117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204506.33125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204506.33132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204506.33141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.33154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.33159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204506.33171: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204506.33189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 
1727204506.33262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204506.33287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204506.33300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204506.33392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204506.35129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204506.35180: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204506.35234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp5celoy_i /root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/AnsiballZ_network_connections.py <<< 44842 1727204506.35292: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204506.36971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204506.37148: stderr chunk (state=3): >>><<< 44842 1727204506.37151: stdout chunk (state=3): >>><<< 44842 1727204506.37154: done transferring module to remote 44842 1727204506.37156: _low_level_execute_command(): starting 44842 1727204506.37157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/ /root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/AnsiballZ_network_connections.py && sleep 0' 44842 1727204506.37730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204506.37748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204506.37772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.37799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.37841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204506.37853: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204506.37872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.37890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204506.37902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204506.37913: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204506.37925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204506.37939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.37954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.37972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204506.37984: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204506.37999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 
1727204506.38080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204506.38108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204506.38111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204506.38843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204506.40584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204506.40668: stderr chunk (state=3): >>><<< 44842 1727204506.40672: stdout chunk (state=3): >>><<< 44842 1727204506.40774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204506.40778: _low_level_execute_command(): starting 44842 1727204506.40781: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/AnsiballZ_network_connections.py && sleep 0' 44842 1727204506.41377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204506.41391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204506.41405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.41421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.41473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204506.41486: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204506.41499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.41515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204506.41526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204506.41535: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204506.41548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204506.41570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.41587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.41601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204506.41613: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204506.41629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.41708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
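
The `_low_level_execute_command()` calls traced above follow a fixed pattern on the remote host. A local sketch, assuming stand-in paths and a stand-in module (in the real run each step is wrapped in `/bin/sh -c '<command> && sleep 0'` over the multiplexed ssh connection):

```shell
# Illustrative re-run of the remote command sequence from the log above.
# Paths and the module file are stand-ins, not the real AnsiballZ payload.
set -e

# 1. 'echo ~' -- discover the remote user's home directory
home=$(sh -c 'echo ~')

# 2. umask 77 makes the per-task temp directory private (mode 0700)
( umask 77 && mkdir -p "$home/.ansible/tmp" )
tmpdir="$home/.ansible/tmp/ansible-tmp-example.$$"
( umask 77 && mkdir "$tmpdir" )

# 3. The AnsiballZ payload is transferred via sftp, then 'chmod u+x'
#    is applied to the directory and the module (stand-in module here)
printf 'print("ok")\n' > "$tmpdir/AnsiballZ_example.py"
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_example.py"

# 4. Execute with the configured remote interpreter
#    (/usr/bin/python3.9 in the log)
python3 "$tmpdir/AnsiballZ_example.py"
```

The `umask 77` subshell is why the temp directory ends up owner-only even though `mkdir` is called without an explicit mode.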
44842 1727204506.41726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204506.41742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204506.42447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204506.75370: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", 
"invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44842 1727204506.76789: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204506.76869: stderr chunk (state=3): >>><<< 44842 1727204506.76883: stdout chunk (state=3): >>><<< 44842 1727204506.76902: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": 
"::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
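The `module_args` captured in the stdout chunk above can be read back into the role variables that produced them. The following is a sketch reconstructed from that log output, assuming the standard `fedora.linux_system_roles.network` role variables (`network_provider`, `network_connections`); it is not taken from the actual playbook file, which is not shown in this log.

```yaml
# Hypothetical playbook reconstructed from the module_args logged above.
# A play like this would drive the same
# fedora.linux_system_roles.network_connections invocation.
- hosts: managed-node1
  vars:
    network_provider: nm
    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        state: up
        autoconnect: true
        ip:
          dhcp4: false
          address:
            - 198.51.100.3/26
            - 2001:db8::2/32
          route:
            - network: 198.51.100.64
              prefix: 26
              gateway: 198.51.100.6
              metric: 4
              table: 30200
            # (the 198.51.100.128/26 and 2001:db8::4/32 routes follow the
            # same pattern; the full list appears in the module_args above)
          routing_rule:
            - priority: 30200
              from: 198.51.100.58/26
              table: 30200
            - priority: 30203
              family: ipv4
              sport: 128 - 256
              table: 30200
            # (rules 30201-30204, 30400-30403, 30600-30602 and the
            # table: custom rule continue as logged in the module_args)
  roles:
    - fedora.linux_system_roles.network
```

Note how the role passes the connection spec through unchanged: every key in `network_connections` (including `routing_rule` entries with `fwmark`, `iif`/`oif`, and string port ranges like `"128 - 256"`) reappears verbatim in the module invocation logged by `_low_level_execute_command()`.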
44842 1727204506.77035: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26', '2001:db8::2/32'], 'route': [{'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '2001:db8::4', 'prefix': 32, 'gateway': '2001:db8::1', 'metric': 2, 'table': 30600}], 'routing_rule': [{'priority': 30200, 'from': '198.51.100.58/26', 'table': 30200}, {'priority': 30201, 'family': 'ipv4', 'fwmark': 1, 'fwmask': 1, 'table': 30200}, {'priority': 30202, 'family': 'ipv4', 'ipproto': 6, 'table': 30200}, {'priority': 30203, 'family': 'ipv4', 'sport': '128 - 256', 'table': 30200}, {'priority': 30204, 'family': 'ipv4', 'tos': 8, 'table': 30200}, {'priority': 30400, 'to': '198.51.100.128/26', 'table': 30400}, {'priority': 30401, 'family': 'ipv4', 'iif': 'iiftest', 'table': 30400}, {'priority': 30402, 'family': 'ipv4', 'oif': 'oiftest', 'table': 30400}, {'priority': 30403, 'from': '0.0.0.0/0', 'to': '0.0.0.0/0', 'table': 30400}, {'priority': 30600, 'to': '2001:db8::4/32', 'table': 30600}, {'priority': 30601, 'family': 'ipv6', 'dport': '128 - 256', 'invert': True, 'table': 30600}, {'priority': 30602, 'from': '::/0', 'to': '::/0', 'table': 30600}, {'priority': 200, 'from': '198.51.100.56/26', 'table': 'custom'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204506.77044: _low_level_execute_command(): starting 44842 1727204506.77048: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204506.0055306-46182-265682951686919/ > /dev/null 2>&1 && sleep 0' 44842 1727204506.77557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204506.77565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.77596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.77610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204506.77620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204506.77674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204506.77680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204506.77688: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204506.77756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204506.79577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204506.79664: stderr chunk (state=3): >>><<< 44842 1727204506.79667: stdout chunk (state=3): >>><<< 44842 1727204506.79688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204506.79693: handler run complete 44842 1727204506.79768: attempt loop complete, returning result 44842 1727204506.79774: _execute() done 44842 1727204506.79778: dumping result to json 44842 1727204506.79791: done dumping result, returning 44842 1727204506.79799: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking 
connection profiles [0affcd87-79f5-aad0-d242-000000000027] 44842 1727204506.79804: sending task result for task 0affcd87-79f5-aad0-d242-000000000027 44842 1727204506.79983: done sending task result for task 0affcd87-79f5-aad0-d242-000000000027 44842 1727204506.79986: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, 
"provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 (not-active) 44842 1727204506.80248: no more pending results, returning what we have 44842 1727204506.80252: results queue empty 44842 1727204506.80253: checking for any_errors_fatal 44842 1727204506.80258: done checking for any_errors_fatal 44842 1727204506.80259: checking for max_fail_percentage 44842 1727204506.80261: done checking for max_fail_percentage 44842 1727204506.80261: checking to see if all hosts have failed and the running result is not ok 44842 1727204506.80262: done checking to see if all hosts have failed 44842 1727204506.80263: getting the remaining hosts for this loop 44842 1727204506.80266: done getting the remaining hosts for this loop 44842 1727204506.80269: getting the next task for host managed-node1 44842 1727204506.80275: done getting next task for host managed-node1 44842 1727204506.80279: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44842 1727204506.80281: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204506.80292: getting variables 44842 1727204506.80293: in VariableManager get_vars() 44842 1727204506.80329: Calling all_inventory to load vars for managed-node1 44842 1727204506.80331: Calling groups_inventory to load vars for managed-node1 44842 1727204506.80333: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204506.80342: Calling all_plugins_play to load vars for managed-node1 44842 1727204506.80344: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204506.80346: Calling groups_plugins_play to load vars for managed-node1 44842 1727204506.82063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204506.83321: done with get_vars() 44842 1727204506.83341: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:46 -0400 (0:00:00.930) 0:00:17.002 ***** 44842 1727204506.83408: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44842 1727204506.83409: Creating lock for fedora.linux_system_roles.network_state 44842 1727204506.83655: worker is 1 (out of 1 available) 44842 1727204506.83676: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44842 1727204506.83689: done queuing things up, now waiting for results queue to drain 44842 1727204506.83690: waiting for pending results... 
44842 1727204506.83877: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 44842 1727204506.83966: in run() - task 0affcd87-79f5-aad0-d242-000000000028 44842 1727204506.83977: variable 'ansible_search_path' from source: unknown 44842 1727204506.83981: variable 'ansible_search_path' from source: unknown 44842 1727204506.84010: calling self._execute() 44842 1727204506.84082: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.84086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204506.84094: variable 'omit' from source: magic vars 44842 1727204506.84399: variable 'ansible_distribution_major_version' from source: facts 44842 1727204506.84422: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204506.84566: variable 'network_state' from source: role '' defaults 44842 1727204506.84583: Evaluated conditional (network_state != {}): False 44842 1727204506.84591: when evaluation is False, skipping this task 44842 1727204506.84599: _execute() done 44842 1727204506.84606: dumping result to json 44842 1727204506.84613: done dumping result, returning 44842 1727204506.84623: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-aad0-d242-000000000028] 44842 1727204506.84640: sending task result for task 0affcd87-79f5-aad0-d242-000000000028 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204506.84804: no more pending results, returning what we have 44842 1727204506.84808: results queue empty 44842 1727204506.84810: checking for any_errors_fatal 44842 1727204506.84830: done checking for any_errors_fatal 44842 1727204506.84831: checking for max_fail_percentage 44842 1727204506.84833: done checking for max_fail_percentage 44842 1727204506.84834: 
checking to see if all hosts have failed and the running result is not ok 44842 1727204506.84835: done checking to see if all hosts have failed 44842 1727204506.84836: getting the remaining hosts for this loop 44842 1727204506.84839: done getting the remaining hosts for this loop 44842 1727204506.84844: getting the next task for host managed-node1 44842 1727204506.84856: done getting next task for host managed-node1 44842 1727204506.84863: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44842 1727204506.84868: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204506.84920: getting variables 44842 1727204506.84922: in VariableManager get_vars() 44842 1727204506.84970: Calling all_inventory to load vars for managed-node1 44842 1727204506.84975: Calling groups_inventory to load vars for managed-node1 44842 1727204506.84978: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204506.84990: Calling all_plugins_play to load vars for managed-node1 44842 1727204506.84998: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204506.85002: Calling groups_plugins_play to load vars for managed-node1 44842 1727204506.85584: done sending task result for task 0affcd87-79f5-aad0-d242-000000000028 44842 1727204506.85588: WORKER PROCESS EXITING 44842 1727204506.87424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204506.89481: done with get_vars() 44842 1727204506.89503: done getting variables 44842 1727204506.89557: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:46 -0400 (0:00:00.061) 0:00:17.064 ***** 44842 1727204506.89597: entering _queue_task() for managed-node1/debug 44842 1727204506.89940: worker is 1 (out of 1 available) 44842 1727204506.89952: exiting _queue_task() for managed-node1/debug 44842 1727204506.89970: done queuing things up, now waiting for results queue to drain 44842 1727204506.89971: waiting for pending results... 
44842 1727204506.90277: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44842 1727204506.90407: in run() - task 0affcd87-79f5-aad0-d242-000000000029 44842 1727204506.90429: variable 'ansible_search_path' from source: unknown 44842 1727204506.90433: variable 'ansible_search_path' from source: unknown 44842 1727204506.90476: calling self._execute() 44842 1727204506.90574: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.90578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204506.90589: variable 'omit' from source: magic vars 44842 1727204506.91003: variable 'ansible_distribution_major_version' from source: facts 44842 1727204506.91015: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204506.91022: variable 'omit' from source: magic vars 44842 1727204506.91094: variable 'omit' from source: magic vars 44842 1727204506.91134: variable 'omit' from source: magic vars 44842 1727204506.91185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204506.91225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204506.91247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204506.91270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204506.91282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204506.91317: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204506.91324: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.91327: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204506.91435: Set connection var ansible_shell_type to sh 44842 1727204506.91447: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204506.91451: Set connection var ansible_connection to ssh 44842 1727204506.91457: Set connection var ansible_pipelining to False 44842 1727204506.91467: Set connection var ansible_timeout to 10 44842 1727204506.91475: Set connection var ansible_shell_executable to /bin/sh 44842 1727204506.91498: variable 'ansible_shell_executable' from source: unknown 44842 1727204506.91501: variable 'ansible_connection' from source: unknown 44842 1727204506.91509: variable 'ansible_module_compression' from source: unknown 44842 1727204506.91512: variable 'ansible_shell_type' from source: unknown 44842 1727204506.91515: variable 'ansible_shell_executable' from source: unknown 44842 1727204506.91517: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.91522: variable 'ansible_pipelining' from source: unknown 44842 1727204506.91524: variable 'ansible_timeout' from source: unknown 44842 1727204506.91529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204506.91691: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204506.91702: variable 'omit' from source: magic vars 44842 1727204506.91707: starting attempt loop 44842 1727204506.91710: running the handler 44842 1727204506.91857: variable '__network_connections_result' from source: set_fact 44842 1727204506.91937: handler run complete 44842 1727204506.91962: attempt loop complete, returning result 44842 1727204506.91968: _execute() done 44842 1727204506.91972: dumping result to json 44842 1727204506.91974: 
done dumping result, returning 44842 1727204506.91989: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-aad0-d242-000000000029] 44842 1727204506.91994: sending task result for task 0affcd87-79f5-aad0-d242-000000000029 44842 1727204506.92090: done sending task result for task 0affcd87-79f5-aad0-d242-000000000029 44842 1727204506.92094: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 (not-active)" ] } 44842 1727204506.92163: no more pending results, returning what we have 44842 1727204506.92169: results queue empty 44842 1727204506.92170: checking for any_errors_fatal 44842 1727204506.92176: done checking for any_errors_fatal 44842 1727204506.92177: checking for max_fail_percentage 44842 1727204506.92179: done checking for max_fail_percentage 44842 1727204506.92180: checking to see if all hosts have failed and the running result is not ok 44842 1727204506.92181: done checking to see if all hosts have failed 44842 1727204506.92181: getting the remaining hosts for this loop 44842 1727204506.92183: done getting the remaining hosts for this loop 44842 1727204506.92188: getting the next task for host managed-node1 44842 1727204506.92196: done getting next task for host managed-node1 44842 1727204506.92200: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44842 1727204506.92204: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204506.92218: getting variables 44842 1727204506.92222: in VariableManager get_vars() 44842 1727204506.92269: Calling all_inventory to load vars for managed-node1 44842 1727204506.92273: Calling groups_inventory to load vars for managed-node1 44842 1727204506.92276: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204506.92287: Calling all_plugins_play to load vars for managed-node1 44842 1727204506.92290: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204506.92293: Calling groups_plugins_play to load vars for managed-node1 44842 1727204506.94089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204506.95895: done with get_vars() 44842 1727204506.95928: done getting variables 44842 1727204506.95999: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:46 -0400 (0:00:00.064) 0:00:17.128 ***** 44842 1727204506.96033: entering _queue_task() for managed-node1/debug 44842 1727204506.96383: worker is 1 (out of 1 available) 44842 1727204506.96396: exiting _queue_task() for 
managed-node1/debug 44842 1727204506.96408: done queuing things up, now waiting for results queue to drain 44842 1727204506.96409: waiting for pending results... 44842 1727204506.96728: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44842 1727204506.96971: in run() - task 0affcd87-79f5-aad0-d242-00000000002a 44842 1727204506.96976: variable 'ansible_search_path' from source: unknown 44842 1727204506.96979: variable 'ansible_search_path' from source: unknown 44842 1727204506.96983: calling self._execute() 44842 1727204506.97059: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.97069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204506.97080: variable 'omit' from source: magic vars 44842 1727204506.97484: variable 'ansible_distribution_major_version' from source: facts 44842 1727204506.97496: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204506.97502: variable 'omit' from source: magic vars 44842 1727204506.97597: variable 'omit' from source: magic vars 44842 1727204506.97632: variable 'omit' from source: magic vars 44842 1727204506.97686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204506.97720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204506.97742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204506.97769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204506.97786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204506.97816: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 44842 1727204506.97819: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.97821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204506.97934: Set connection var ansible_shell_type to sh 44842 1727204506.97944: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204506.97949: Set connection var ansible_connection to ssh 44842 1727204506.97955: Set connection var ansible_pipelining to False 44842 1727204506.97961: Set connection var ansible_timeout to 10 44842 1727204506.97978: Set connection var ansible_shell_executable to /bin/sh 44842 1727204506.98005: variable 'ansible_shell_executable' from source: unknown 44842 1727204506.98008: variable 'ansible_connection' from source: unknown 44842 1727204506.98011: variable 'ansible_module_compression' from source: unknown 44842 1727204506.98013: variable 'ansible_shell_type' from source: unknown 44842 1727204506.98015: variable 'ansible_shell_executable' from source: unknown 44842 1727204506.98018: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204506.98020: variable 'ansible_pipelining' from source: unknown 44842 1727204506.98024: variable 'ansible_timeout' from source: unknown 44842 1727204506.98028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204506.98179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204506.98193: variable 'omit' from source: magic vars 44842 1727204506.98198: starting attempt loop 44842 1727204506.98201: running the handler 44842 1727204506.98251: variable '__network_connections_result' from source: set_fact 44842 1727204506.98344: variable 
'__network_connections_result' from source: set_fact
44842 1727204506.98768: handler run complete
44842 1727204506.98847: attempt loop complete, returning result
44842 1727204506.98850: _execute() done
44842 1727204506.98853: dumping result to json
44842 1727204506.98868: done dumping result, returning
44842 1727204506.98879: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-aad0-d242-00000000002a]
44842 1727204506.98884: sending task result for task 0affcd87-79f5-aad0-d242-00000000002a
44842 1727204506.99009: done sending task result for task 0affcd87-79f5-aad0-d242-00000000002a
44842 1727204506.99013: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": true,
                        "interface_name": "ethtest0",
                        "ip": {
                            "address": [
                                "198.51.100.3/26",
                                "2001:db8::2/32"
                            ],
                            "dhcp4": false,
                            "route": [
                                {"gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200},
                                {"gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400},
                                {"gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600}
                            ],
                            "routing_rule": [
                                {"from": "198.51.100.58/26", "priority": 30200, "table": 30200},
                                {"family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200},
                                {"family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200},
                                {"family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200},
                                {"family": "ipv4", "priority": 30204, "table": 30200, "tos": 8},
                                {"priority": 30400, "table": 30400, "to": "198.51.100.128/26"},
                                {"family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400},
                                {"family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400},
                                {"from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0"},
                                {"priority": 30600, "table": 30600, "to": "2001:db8::4/32"},
                                {"dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600},
                                {"from": "::/0", "priority": 30602, "table": 30600, "to": "::/0"},
                                {"from": "198.51.100.56/26", "priority": 200, "table": "custom"}
                            ]
                        },
                        "name": "ethtest0",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 (not-active)\n",
        "stderr_lines": [
            "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33",
            "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 7ac3bb1f-688e-4ad4-89b8-fb40a9966f33 (not-active)"
        ]
    }
}
44842 1727204506.99200: no more pending results, returning what we have
44842 1727204506.99204: results queue empty
44842 1727204506.99206: checking for any_errors_fatal
44842 1727204506.99213: done checking for any_errors_fatal
44842 1727204506.99214: checking for max_fail_percentage
44842 1727204506.99216: done checking for max_fail_percentage
44842 1727204506.99217: checking to see if all hosts have failed and the running result is not ok
44842 1727204506.99218: done checking to see if all hosts have failed
44842 1727204506.99219: getting the remaining hosts for this loop
44842 1727204506.99221: done getting the remaining hosts for this loop
44842 1727204506.99225: getting the next task for host managed-node1
44842 1727204506.99233: done getting next task for host managed-node1
44842 1727204506.99239: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
44842
1727204506.99243: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204506.99256: getting variables 44842 1727204506.99258: in VariableManager get_vars() 44842 1727204506.99306: Calling all_inventory to load vars for managed-node1 44842 1727204506.99309: Calling groups_inventory to load vars for managed-node1 44842 1727204506.99312: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204506.99324: Calling all_plugins_play to load vars for managed-node1 44842 1727204506.99327: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204506.99331: Calling groups_plugins_play to load vars for managed-node1 44842 1727204507.01256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204507.03106: done with get_vars() 44842 1727204507.03137: done getting variables 44842 1727204507.03207: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:47 -0400 (0:00:00.072) 
0:00:17.200 ***** 44842 1727204507.03240: entering _queue_task() for managed-node1/debug 44842 1727204507.03590: worker is 1 (out of 1 available) 44842 1727204507.03604: exiting _queue_task() for managed-node1/debug 44842 1727204507.03620: done queuing things up, now waiting for results queue to drain 44842 1727204507.03622: waiting for pending results... 44842 1727204507.03938: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44842 1727204507.04090: in run() - task 0affcd87-79f5-aad0-d242-00000000002b 44842 1727204507.04105: variable 'ansible_search_path' from source: unknown 44842 1727204507.04109: variable 'ansible_search_path' from source: unknown 44842 1727204507.04142: calling self._execute() 44842 1727204507.04238: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204507.04243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204507.04252: variable 'omit' from source: magic vars 44842 1727204507.04638: variable 'ansible_distribution_major_version' from source: facts 44842 1727204507.04651: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204507.04787: variable 'network_state' from source: role '' defaults 44842 1727204507.04797: Evaluated conditional (network_state != {}): False 44842 1727204507.04801: when evaluation is False, skipping this task 44842 1727204507.04803: _execute() done 44842 1727204507.04806: dumping result to json 44842 1727204507.04808: done dumping result, returning 44842 1727204507.04822: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-aad0-d242-00000000002b] 44842 1727204507.04834: sending task result for task 0affcd87-79f5-aad0-d242-00000000002b 44842 1727204507.04926: done sending task result for task 0affcd87-79f5-aad0-d242-00000000002b 44842 1727204507.04929: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 44842 1727204507.04986: no more pending results, returning what we have 44842 1727204507.04990: results queue empty 44842 1727204507.04991: checking for any_errors_fatal 44842 1727204507.05006: done checking for any_errors_fatal 44842 1727204507.05007: checking for max_fail_percentage 44842 1727204507.05009: done checking for max_fail_percentage 44842 1727204507.05010: checking to see if all hosts have failed and the running result is not ok 44842 1727204507.05011: done checking to see if all hosts have failed 44842 1727204507.05012: getting the remaining hosts for this loop 44842 1727204507.05015: done getting the remaining hosts for this loop 44842 1727204507.05019: getting the next task for host managed-node1 44842 1727204507.05029: done getting next task for host managed-node1 44842 1727204507.05033: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44842 1727204507.05037: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204507.05054: getting variables 44842 1727204507.05056: in VariableManager get_vars() 44842 1727204507.05101: Calling all_inventory to load vars for managed-node1 44842 1727204507.05105: Calling groups_inventory to load vars for managed-node1 44842 1727204507.05107: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204507.05120: Calling all_plugins_play to load vars for managed-node1 44842 1727204507.05123: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204507.05126: Calling groups_plugins_play to load vars for managed-node1 44842 1727204507.06846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204507.08654: done with get_vars() 44842 1727204507.08692: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:47 -0400 (0:00:00.055) 0:00:17.256 ***** 44842 1727204507.08799: entering _queue_task() for managed-node1/ping 44842 1727204507.08801: Creating lock for ping 44842 1727204507.09157: worker is 1 (out of 1 available) 44842 1727204507.09178: exiting _queue_task() for managed-node1/ping 44842 1727204507.09191: done queuing things up, now waiting for results queue to drain 44842 1727204507.09192: waiting for pending results... 
44842 1727204507.09507: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 44842 1727204507.09654: in run() - task 0affcd87-79f5-aad0-d242-00000000002c 44842 1727204507.09673: variable 'ansible_search_path' from source: unknown 44842 1727204507.09680: variable 'ansible_search_path' from source: unknown 44842 1727204507.09714: calling self._execute() 44842 1727204507.09817: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204507.09826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204507.09836: variable 'omit' from source: magic vars 44842 1727204507.10226: variable 'ansible_distribution_major_version' from source: facts 44842 1727204507.10238: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204507.10245: variable 'omit' from source: magic vars 44842 1727204507.10317: variable 'omit' from source: magic vars 44842 1727204507.10352: variable 'omit' from source: magic vars 44842 1727204507.10403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204507.10439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204507.10460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204507.10488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204507.10500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204507.10537: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204507.10541: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204507.10543: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 44842 1727204507.10650: Set connection var ansible_shell_type to sh 44842 1727204507.10661: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204507.10671: Set connection var ansible_connection to ssh 44842 1727204507.10677: Set connection var ansible_pipelining to False 44842 1727204507.10683: Set connection var ansible_timeout to 10 44842 1727204507.10691: Set connection var ansible_shell_executable to /bin/sh 44842 1727204507.10718: variable 'ansible_shell_executable' from source: unknown 44842 1727204507.10721: variable 'ansible_connection' from source: unknown 44842 1727204507.10728: variable 'ansible_module_compression' from source: unknown 44842 1727204507.10731: variable 'ansible_shell_type' from source: unknown 44842 1727204507.10733: variable 'ansible_shell_executable' from source: unknown 44842 1727204507.10737: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204507.10741: variable 'ansible_pipelining' from source: unknown 44842 1727204507.10744: variable 'ansible_timeout' from source: unknown 44842 1727204507.10748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204507.10979: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204507.10988: variable 'omit' from source: magic vars 44842 1727204507.10994: starting attempt loop 44842 1727204507.10997: running the handler 44842 1727204507.11011: _low_level_execute_command(): starting 44842 1727204507.11019: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204507.11816: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204507.11829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 
1727204507.11845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204507.11860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.11908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204507.11915: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204507.11925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.11939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204507.11949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204507.11957: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204507.11971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204507.11986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204507.11998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.12007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204507.12017: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204507.12027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.12108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204507.12132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204507.12144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204507.12232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204507.13890: stdout chunk (state=3): >>>/root <<< 44842 1727204507.14082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204507.14089: stdout chunk (state=3): >>><<< 44842 1727204507.14098: stderr chunk (state=3): >>><<< 44842 1727204507.14127: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204507.14141: _low_level_execute_command(): starting 44842 1727204507.14147: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480 `" && echo ansible-tmp-1727204507.1412666-46227-168204353320480="` echo /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480 `" ) && sleep 0' 44842 1727204507.15342: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204507.15346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.16117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204507.16123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204507.16141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204507.16145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.16252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204507.16258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204507.16312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204507.18187: stdout chunk (state=3): >>>ansible-tmp-1727204507.1412666-46227-168204353320480=/root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480 <<< 44842 1727204507.18391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204507.18396: stdout chunk (state=3): >>><<< 44842 1727204507.18399: stderr chunk (state=3): >>><<< 44842 1727204507.18572: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204507.1412666-46227-168204353320480=/root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204507.18577: variable 'ansible_module_compression' from source: unknown 44842 1727204507.18579: ANSIBALLZ: Using lock for ping 44842 1727204507.18581: ANSIBALLZ: Acquiring lock 44842 1727204507.18583: ANSIBALLZ: Lock acquired: 140164880215568 44842 1727204507.18585: ANSIBALLZ: Creating module 44842 1727204507.35108: ANSIBALLZ: Writing module into payload 44842 1727204507.35190: ANSIBALLZ: Writing module 44842 1727204507.35222: ANSIBALLZ: Renaming module 44842 1727204507.35234: ANSIBALLZ: Done creating module 44842 1727204507.35256: variable 'ansible_facts' from source: unknown 44842 1727204507.35337: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/AnsiballZ_ping.py 44842 1727204507.35522: Sending initial data 44842 1727204507.35525: Sent initial data (153 bytes) 44842 1727204507.37255: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204507.37262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.37296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.37300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.37302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.37363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204507.37386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204507.37505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204507.37593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204507.39319: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204507.39372: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204507.39421: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmppiol0k21 /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/AnsiballZ_ping.py <<< 44842 1727204507.39501: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204507.40723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204507.41140: stderr chunk (state=3): >>><<< 44842 1727204507.41144: stdout chunk (state=3): >>><<< 44842 1727204507.41146: done transferring module to remote 44842 1727204507.41148: _low_level_execute_command(): starting 44842 1727204507.41150: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/ /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/AnsiballZ_ping.py && sleep 0' 44842 1727204507.42571: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204507.42588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204507.42609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204507.42638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.42685: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204507.42697: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.42723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204507.42746: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204507.42763: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204507.42779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204507.42831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204507.42859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204507.42890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204507.42894: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204507.42896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204507.42988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204507.43013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204507.43031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204507.43088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204507.44796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204507.44858: stderr chunk (state=3): >>><<< 44842 1727204507.44862: stdout chunk (state=3): >>><<< 44842 1727204507.44879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204507.44884: _low_level_execute_command(): starting
44842 1727204507.44887: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/AnsiballZ_ping.py && sleep 0'
44842 1727204507.45365: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.45369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.45409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.45413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.45423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.45483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.45487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.45553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204507.58402: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<<
44842 1727204507.59286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<<
44842 1727204507.59360: stderr chunk (state=3): >>><<<
44842 1727204507.59365: stdout chunk (state=3): >>><<<
44842 1727204507.59389: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed.
44842 1727204507.59415: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
44842 1727204507.59424: _low_level_execute_command(): starting
44842 1727204507.59429: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204507.1412666-46227-168204353320480/ > /dev/null 2>&1 && sleep 0'
44842 1727204507.60100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204507.60106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.60117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.60131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.60175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.60183: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204507.60194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.60206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204507.60215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204507.60218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204507.60227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.60236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.60248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.60255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.60266: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204507.60278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.60347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.60362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204507.60378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.60469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204507.62306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204507.62309: stdout chunk (state=3): >>><<<
44842 1727204507.62313: stderr chunk (state=3): >>><<<
44842 1727204507.62370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204507.62374: handler run complete
44842 1727204507.62376: attempt loop complete, returning result
44842 1727204507.62378: _execute() done
44842 1727204507.62380: dumping result to json
44842 1727204507.62382: done dumping result, returning
44842 1727204507.62570: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-aad0-d242-00000000002c]
44842 1727204507.62573: sending task result for task 0affcd87-79f5-aad0-d242-00000000002c
44842 1727204507.62642: done sending task result for task 0affcd87-79f5-aad0-d242-00000000002c
44842 1727204507.62646: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "ping": "pong"
}
44842 1727204507.62710: no more pending results, returning what we have
44842 1727204507.62714: results queue empty
44842 1727204507.62715: checking for any_errors_fatal
44842 1727204507.62721: done checking for any_errors_fatal
44842 1727204507.62722: checking for max_fail_percentage
44842 1727204507.62723: done checking for max_fail_percentage
44842 1727204507.62724: checking to see if all hosts have failed and the running result is not ok
44842 1727204507.62725: done checking to see if all hosts have failed
44842 1727204507.62726: getting the remaining hosts for this loop
44842 1727204507.62728: done getting the remaining hosts for this loop
44842 1727204507.62731: getting the next task for host managed-node1
44842 1727204507.62740: done getting next task for host managed-node1
44842 1727204507.62742: ^ task is: TASK: meta (role_complete)
44842 1727204507.62746: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204507.62757: getting variables
44842 1727204507.62759: in VariableManager get_vars()
44842 1727204507.62797: Calling all_inventory to load vars for managed-node1
44842 1727204507.62800: Calling groups_inventory to load vars for managed-node1
44842 1727204507.62802: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204507.62811: Calling all_plugins_play to load vars for managed-node1
44842 1727204507.62813: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204507.62816: Calling groups_plugins_play to load vars for managed-node1
44842 1727204507.64744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204507.66534: done with get_vars()
44842 1727204507.66569: done getting variables
44842 1727204507.66655: done queuing things up, now waiting for results queue to drain
44842 1727204507.66657: results queue empty
44842 1727204507.66658: checking for any_errors_fatal
44842 1727204507.66665: done checking for any_errors_fatal
44842 1727204507.66666: checking for max_fail_percentage
44842 1727204507.66667: done checking for max_fail_percentage
44842 1727204507.66668: checking to see if all hosts have failed and the running result is not ok
44842 1727204507.66669: done checking to see if all hosts have failed
44842 1727204507.66670: getting the remaining hosts for this loop
44842 1727204507.66671: done getting the remaining hosts for this loop
44842 1727204507.66674: getting the next task for host managed-node1
44842 1727204507.66679: done getting next task for host managed-node1
44842 1727204507.66681: ^ task is: TASK: Get the routing rule for looking up the table 30200
44842 1727204507.66683: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204507.66685: getting variables
44842 1727204507.66686: in VariableManager get_vars()
44842 1727204507.66698: Calling all_inventory to load vars for managed-node1
44842 1727204507.66700: Calling groups_inventory to load vars for managed-node1
44842 1727204507.66702: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204507.66707: Calling all_plugins_play to load vars for managed-node1
44842 1727204507.66709: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204507.66711: Calling groups_plugins_play to load vars for managed-node1
44842 1727204507.67974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204507.69682: done with get_vars()
44842 1727204507.69707: done getting variables
44842 1727204507.69752: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get the routing rule for looking up the table 30200] *********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:115
Tuesday 24 September 2024  15:01:47 -0400 (0:00:00.609)       0:00:17.866 *****
44842 1727204507.69793: entering _queue_task() for managed-node1/command
44842 1727204507.70134: worker is 1 (out of 1 available)
44842 1727204507.70147: exiting _queue_task() for managed-node1/command
44842 1727204507.70166: done queuing things up, now waiting for results queue to drain
44842 1727204507.70168: waiting for pending results...
44842 1727204507.70482: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30200
44842 1727204507.70579: in run() - task 0affcd87-79f5-aad0-d242-00000000005c
44842 1727204507.70593: variable 'ansible_search_path' from source: unknown
44842 1727204507.70637: calling self._execute()
44842 1727204507.70746: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204507.70751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204507.70771: variable 'omit' from source: magic vars
44842 1727204507.71204: variable 'ansible_distribution_major_version' from source: facts
44842 1727204507.71216: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204507.71344: variable 'ansible_distribution_major_version' from source: facts
44842 1727204507.71350: Evaluated conditional (ansible_distribution_major_version != "7"): True
44842 1727204507.71357: variable 'omit' from source: magic vars
44842 1727204507.71387: variable 'omit' from source: magic vars
44842 1727204507.71429: variable 'omit' from source: magic vars
44842 1727204507.71479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204507.71520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204507.71546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204507.71566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204507.71580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204507.71615: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204507.71619: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204507.71622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204507.71733: Set connection var ansible_shell_type to sh
44842 1727204507.71748: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204507.71753: Set connection var ansible_connection to ssh
44842 1727204507.71759: Set connection var ansible_pipelining to False
44842 1727204507.71769: Set connection var ansible_timeout to 10
44842 1727204507.71777: Set connection var ansible_shell_executable to /bin/sh
44842 1727204507.71801: variable 'ansible_shell_executable' from source: unknown
44842 1727204507.71805: variable 'ansible_connection' from source: unknown
44842 1727204507.71810: variable 'ansible_module_compression' from source: unknown
44842 1727204507.71816: variable 'ansible_shell_type' from source: unknown
44842 1727204507.71819: variable 'ansible_shell_executable' from source: unknown
44842 1727204507.71827: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204507.71830: variable 'ansible_pipelining' from source: unknown
44842 1727204507.71934: variable 'ansible_timeout' from source: unknown
44842 1727204507.71938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204507.71996: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
44842 1727204507.72005: variable 'omit' from source: magic vars
44842 1727204507.72014: starting attempt loop
44842 1727204507.72017: running the handler
44842 1727204507.72030: _low_level_execute_command(): starting
44842 1727204507.72035: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44842 1727204507.73051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204507.73073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.73084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.73099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.73137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.73152: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204507.73161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.73181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204507.73189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204507.73195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204507.73203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.73212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.73223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.73230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.73237: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204507.73247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.73338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.73345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204507.73486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.73575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204507.75281: stdout chunk (state=3): >>>/root <<<
44842 1727204507.75286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204507.75289: stderr chunk (state=3): >>><<<
44842 1727204507.75291: stdout chunk (state=3): >>><<<
44842 1727204507.75320: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204507.75333: _low_level_execute_command(): starting
44842 1727204507.75369: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347 `" && echo ansible-tmp-1727204507.7531948-46251-97627604080347="` echo /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347 `" ) && sleep 0'
44842 1727204507.76252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204507.76261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.76278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.76293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.76336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.76344: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204507.76354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.76374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204507.76382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204507.76389: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204507.76398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.76414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.76426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.76433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.76441: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204507.76450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.76592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.76641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204507.76652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.76861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204507.78686: stdout chunk (state=3): >>>ansible-tmp-1727204507.7531948-46251-97627604080347=/root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347 <<<
44842 1727204507.78869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204507.78873: stdout chunk (state=3): >>><<<
44842 1727204507.78882: stderr chunk (state=3): >>><<<
44842 1727204507.78902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204507.7531948-46251-97627604080347=/root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204507.78937: variable 'ansible_module_compression' from source: unknown
44842 1727204507.78995: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
44842 1727204507.79030: variable 'ansible_facts' from source: unknown
44842 1727204507.79114: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/AnsiballZ_command.py
44842 1727204507.79268: Sending initial data
44842 1727204507.79271: Sent initial data (155 bytes)
44842 1727204507.80285: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204507.80293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.80306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.80323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.80367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.80371: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204507.80382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.80397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204507.80402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204507.80409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204507.80422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.80432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.80443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.80452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.80457: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204507.80467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.80537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.80552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204507.80556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.80646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204507.82345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
44842 1727204507.82397: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
44842 1727204507.82453: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpmvawysqb /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/AnsiballZ_command.py <<<
44842 1727204507.82505: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
44842 1727204507.83683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204507.83870: stderr chunk (state=3): >>><<<
44842 1727204507.83873: stdout chunk (state=3): >>><<<
44842 1727204507.83875: done transferring module to remote
44842 1727204507.83882: _low_level_execute_command(): starting
44842 1727204507.83966: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/ /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/AnsiballZ_command.py && sleep 0'
44842 1727204507.84596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204507.84611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.84628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.84643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.84687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.84700: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204507.84714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.84735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204507.84746: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204507.84757: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204507.84775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.84790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.84805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.84816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.84825: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204507.84840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.84918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.84935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204507.84953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.85072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204507.86831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204507.86835: stdout chunk (state=3): >>><<<
44842 1727204507.86843: stderr chunk (state=3): >>><<<
44842 1727204507.86863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204507.86868: _low_level_execute_command(): starting
44842 1727204507.86870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/AnsiballZ_command.py && sleep 0'
44842 1727204507.87499: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204507.87508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.87519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.87533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.87580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.87587: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204507.87597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.87610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204507.87618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204507.87624: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204507.87632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204507.87643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204507.87654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204507.87666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204507.87669: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204507.87680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204507.87766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204507.87774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204507.87778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204507.87878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204508.01536: stdout chunk (state=3): >>> {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 15:01:48.008833", "end": "2024-09-24 15:01:48.014508", "delta": "0:00:00.005675", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir":
null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204508.02674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204508.02678: stderr chunk (state=3): >>><<< 44842 1727204508.02681: stdout chunk (state=3): >>><<< 44842 1727204508.02704: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-24 15:01:48.008833", "end": "2024-09-24 15:01:48.014508", "delta": "0:00:00.005675", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204508.02744: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204508.02751: _low_level_execute_command(): starting 44842 1727204508.02757: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204507.7531948-46251-97627604080347/ > /dev/null 2>&1 && sleep 0' 44842 1727204508.04126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.04136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.04145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.04158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.04915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 <<< 44842 1727204508.04922: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.04932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.04946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.04954: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.04961: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.04976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.04986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.04998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.05005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.05013: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.05020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.05103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.05123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.05137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.05221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.07078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.07083: stdout chunk (state=3): >>><<< 44842 1727204508.07089: stderr chunk (state=3): >>><<< 44842 1727204508.07107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.07113: handler run complete 44842 1727204508.07139: Evaluated conditional (False): False 44842 1727204508.07149: attempt loop complete, returning result 44842 1727204508.07152: _execute() done 44842 1727204508.07154: dumping result to json 44842 1727204508.07161: done dumping result, returning 44842 1727204508.07174: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30200 [0affcd87-79f5-aad0-d242-00000000005c] 44842 1727204508.07180: sending task result for task 0affcd87-79f5-aad0-d242-00000000005c 44842 1727204508.07289: done sending task result for task 0affcd87-79f5-aad0-d242-00000000005c 44842 1727204508.07292: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30200" ], "delta": "0:00:00.005675", "end": "2024-09-24 15:01:48.014508", "rc": 0, "start": "2024-09-24 15:01:48.008833" } STDOUT: 30200: from 
198.51.100.58/26 lookup 30200 proto static 30201: from all fwmark 0x1/0x1 lookup 30200 proto static 30202: from all ipproto tcp lookup 30200 proto static 30203: from all sport 128-256 lookup 30200 proto static 30204: from all tos throughput lookup 30200 proto static 44842 1727204508.07379: no more pending results, returning what we have 44842 1727204508.07383: results queue empty 44842 1727204508.07384: checking for any_errors_fatal 44842 1727204508.07385: done checking for any_errors_fatal 44842 1727204508.07386: checking for max_fail_percentage 44842 1727204508.07387: done checking for max_fail_percentage 44842 1727204508.07388: checking to see if all hosts have failed and the running result is not ok 44842 1727204508.07389: done checking to see if all hosts have failed 44842 1727204508.07390: getting the remaining hosts for this loop 44842 1727204508.07392: done getting the remaining hosts for this loop 44842 1727204508.07396: getting the next task for host managed-node1 44842 1727204508.07402: done getting next task for host managed-node1 44842 1727204508.07404: ^ task is: TASK: Get the routing rule for looking up the table 30400 44842 1727204508.07406: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204508.07410: getting variables 44842 1727204508.07411: in VariableManager get_vars() 44842 1727204508.07449: Calling all_inventory to load vars for managed-node1 44842 1727204508.07452: Calling groups_inventory to load vars for managed-node1 44842 1727204508.07454: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204508.07465: Calling all_plugins_play to load vars for managed-node1 44842 1727204508.07467: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204508.07470: Calling groups_plugins_play to load vars for managed-node1 44842 1727204508.10213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204508.13660: done with get_vars() 44842 1727204508.13696: done getting variables 44842 1727204508.13787: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30400] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:122 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.440) 0:00:18.306 ***** 44842 1727204508.13823: entering _queue_task() for managed-node1/command 44842 1727204508.14172: worker is 1 (out of 1 available) 44842 1727204508.14186: exiting _queue_task() for managed-node1/command 44842 1727204508.14199: done queuing things up, now waiting for results queue to drain 44842 1727204508.14200: waiting for pending results... 
44842 1727204508.14489: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30400 44842 1727204508.14604: in run() - task 0affcd87-79f5-aad0-d242-00000000005d 44842 1727204508.14629: variable 'ansible_search_path' from source: unknown 44842 1727204508.14678: calling self._execute() 44842 1727204508.14795: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.14807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.14822: variable 'omit' from source: magic vars 44842 1727204508.15457: variable 'ansible_distribution_major_version' from source: facts 44842 1727204508.15497: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204508.15815: variable 'ansible_distribution_major_version' from source: facts 44842 1727204508.15829: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204508.15845: variable 'omit' from source: magic vars 44842 1727204508.15874: variable 'omit' from source: magic vars 44842 1727204508.15977: variable 'omit' from source: magic vars 44842 1727204508.16083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204508.16173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204508.16234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204508.16259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204508.16310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204508.16386: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204508.16468: variable 'ansible_host' from source: host vars for 
'managed-node1' 44842 1727204508.16479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.16696: Set connection var ansible_shell_type to sh 44842 1727204508.16716: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204508.16725: Set connection var ansible_connection to ssh 44842 1727204508.16733: Set connection var ansible_pipelining to False 44842 1727204508.16741: Set connection var ansible_timeout to 10 44842 1727204508.16751: Set connection var ansible_shell_executable to /bin/sh 44842 1727204508.16780: variable 'ansible_shell_executable' from source: unknown 44842 1727204508.16793: variable 'ansible_connection' from source: unknown 44842 1727204508.16816: variable 'ansible_module_compression' from source: unknown 44842 1727204508.16822: variable 'ansible_shell_type' from source: unknown 44842 1727204508.16827: variable 'ansible_shell_executable' from source: unknown 44842 1727204508.16903: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.16912: variable 'ansible_pipelining' from source: unknown 44842 1727204508.16923: variable 'ansible_timeout' from source: unknown 44842 1727204508.16930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.17197: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204508.17244: variable 'omit' from source: magic vars 44842 1727204508.17282: starting attempt loop 44842 1727204508.17289: running the handler 44842 1727204508.17311: _low_level_execute_command(): starting 44842 1727204508.17348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204508.19376: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.19407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.19422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.19440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.19488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.19504: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.19518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.19535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.19546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.19556: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.19569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.19583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.19598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.19613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.19624: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.19636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.19715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.19743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.19761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44842 1727204508.19854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.21407: stdout chunk (state=3): >>>/root <<< 44842 1727204508.21584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.21618: stderr chunk (state=3): >>><<< 44842 1727204508.21621: stdout chunk (state=3): >>><<< 44842 1727204508.21742: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.21746: _low_level_execute_command(): starting 44842 1727204508.21750: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821 `" && echo ansible-tmp-1727204508.2164416-46280-145690100981821="` echo 
/root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821 `" ) && sleep 0' 44842 1727204508.23272: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.23277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.23312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204508.23316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.23319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.23505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.23566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.23736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.25577: stdout chunk (state=3): >>>ansible-tmp-1727204508.2164416-46280-145690100981821=/root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821 <<< 44842 1727204508.25683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.25766: stderr chunk (state=3): >>><<< 44842 1727204508.25770: stdout chunk (state=3): 
>>><<< 44842 1727204508.26072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204508.2164416-46280-145690100981821=/root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.26076: variable 'ansible_module_compression' from source: unknown 44842 1727204508.26079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204508.26081: variable 'ansible_facts' from source: unknown 44842 1727204508.26083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/AnsiballZ_command.py 44842 1727204508.26812: Sending initial data 44842 1727204508.26816: Sent initial data (156 bytes) 44842 1727204508.29143: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
44842 1727204508.29216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.29239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.29259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.29308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.29345: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.29361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.29384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.29397: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.29434: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.29450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.29463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.29480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.29491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.29500: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.29511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.29609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.29702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.29717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 
1727204508.29799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.31502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204508.31588: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204508.31992: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpmtjcsnl1 /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/AnsiballZ_command.py <<< 44842 1727204508.32045: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204508.33404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.33511: stderr chunk (state=3): >>><<< 44842 1727204508.33515: stdout chunk (state=3): >>><<< 44842 1727204508.33557: done transferring module to remote 44842 1727204508.33583: _low_level_execute_command(): starting 44842 1727204508.33589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/ /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/AnsiballZ_command.py && sleep 0' 44842 1727204508.34268: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.34274: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 44842 1727204508.34286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.34298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.34338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.34344: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.34354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.34369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.34376: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.34383: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.34391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.34402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.34410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.34418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.34424: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.34433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.34512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.34530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.34541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.34620: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44842 1727204508.36388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.36392: stdout chunk (state=3): >>><<< 44842 1727204508.36398: stderr chunk (state=3): >>><<< 44842 1727204508.36420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.36424: _low_level_execute_command(): starting 44842 1727204508.36426: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/AnsiballZ_command.py && sleep 0' 44842 1727204508.37206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.37210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
44842 1727204508.37226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.37262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.37288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.37291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.37293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.37358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.37366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.37370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.37436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.50901: stdout chunk (state=3): >>> {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 15:01:48.505065", "end": "2024-09-24 15:01:48.508244", "delta": "0:00:00.003179", "msg": "", "invocation": {"module_args": {"_raw_params": "ip 
rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204508.52110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204508.52122: stderr chunk (state=3): >>><<< 44842 1727204508.52125: stdout chunk (state=3): >>><<< 44842 1727204508.52145: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-24 15:01:48.505065", "end": "2024-09-24 15:01:48.508244", "delta": "0:00:00.003179", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204508.52185: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204508.52193: _low_level_execute_command(): starting 44842 1727204508.52198: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204508.2164416-46280-145690100981821/ > /dev/null 2>&1 && sleep 0' 44842 1727204508.52670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.52689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.52701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44842 1727204508.52712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.52763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.52779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.52844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.54613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.54674: stderr chunk (state=3): >>><<< 44842 1727204508.54678: stdout chunk (state=3): >>><<< 44842 1727204508.54695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.54701: handler run complete 44842 1727204508.54724: Evaluated conditional (False): False 44842 1727204508.54732: attempt loop complete, returning result 44842 1727204508.54735: _execute() done 44842 1727204508.54737: dumping result to json 44842 1727204508.54742: done dumping result, returning 44842 1727204508.54750: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30400 [0affcd87-79f5-aad0-d242-00000000005d] 44842 1727204508.54755: sending task result for task 0affcd87-79f5-aad0-d242-00000000005d 44842 1727204508.54857: done sending task result for task 0affcd87-79f5-aad0-d242-00000000005d 44842 1727204508.54862: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30400" ], "delta": "0:00:00.003179", "end": "2024-09-24 15:01:48.508244", "rc": 0, "start": "2024-09-24 15:01:48.505065" } STDOUT: 30400: from all to 198.51.100.128/26 lookup 30400 proto static 30401: from all iif iiftest [detached] lookup 30400 proto static 30402: from all oif oiftest [detached] lookup 30400 proto static 30403: from all lookup 30400 proto static 44842 1727204508.54954: no more pending results, returning what we have 44842 1727204508.54958: results queue empty 44842 1727204508.54959: checking for any_errors_fatal 44842 1727204508.54971: done checking for any_errors_fatal 44842 1727204508.54972: checking for max_fail_percentage 44842 1727204508.54973: done checking for max_fail_percentage 44842 1727204508.54974: checking to see if all 
hosts have failed and the running result is not ok 44842 1727204508.54976: done checking to see if all hosts have failed 44842 1727204508.54976: getting the remaining hosts for this loop 44842 1727204508.54978: done getting the remaining hosts for this loop 44842 1727204508.54982: getting the next task for host managed-node1 44842 1727204508.54988: done getting next task for host managed-node1 44842 1727204508.54991: ^ task is: TASK: Get the routing rule for looking up the table 30600 44842 1727204508.54993: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204508.54996: getting variables 44842 1727204508.54998: in VariableManager get_vars() 44842 1727204508.55033: Calling all_inventory to load vars for managed-node1 44842 1727204508.55035: Calling groups_inventory to load vars for managed-node1 44842 1727204508.55037: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204508.55048: Calling all_plugins_play to load vars for managed-node1 44842 1727204508.55050: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204508.55052: Calling groups_plugins_play to load vars for managed-node1 44842 1727204508.55897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204508.56962: done with get_vars() 44842 1727204508.56985: done getting variables 44842 1727204508.57031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the 
table 30600] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:129 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.432) 0:00:18.738 ***** 44842 1727204508.57052: entering _queue_task() for managed-node1/command 44842 1727204508.57304: worker is 1 (out of 1 available) 44842 1727204508.57318: exiting _queue_task() for managed-node1/command 44842 1727204508.57331: done queuing things up, now waiting for results queue to drain 44842 1727204508.57333: waiting for pending results... 44842 1727204508.57522: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30600 44842 1727204508.57591: in run() - task 0affcd87-79f5-aad0-d242-00000000005e 44842 1727204508.57603: variable 'ansible_search_path' from source: unknown 44842 1727204508.57632: calling self._execute() 44842 1727204508.57711: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.57715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.57724: variable 'omit' from source: magic vars 44842 1727204508.58007: variable 'ansible_distribution_major_version' from source: facts 44842 1727204508.58018: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204508.58103: variable 'ansible_distribution_major_version' from source: facts 44842 1727204508.58106: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204508.58114: variable 'omit' from source: magic vars 44842 1727204508.58130: variable 'omit' from source: magic vars 44842 1727204508.58372: variable 'omit' from source: magic vars 44842 1727204508.58375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204508.58378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 
1727204508.58381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204508.58383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204508.58385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204508.58388: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204508.58390: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.58392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.58481: Set connection var ansible_shell_type to sh 44842 1727204508.58485: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204508.58487: Set connection var ansible_connection to ssh 44842 1727204508.58489: Set connection var ansible_pipelining to False 44842 1727204508.58491: Set connection var ansible_timeout to 10 44842 1727204508.58493: Set connection var ansible_shell_executable to /bin/sh 44842 1727204508.58495: variable 'ansible_shell_executable' from source: unknown 44842 1727204508.58497: variable 'ansible_connection' from source: unknown 44842 1727204508.58499: variable 'ansible_module_compression' from source: unknown 44842 1727204508.58501: variable 'ansible_shell_type' from source: unknown 44842 1727204508.58503: variable 'ansible_shell_executable' from source: unknown 44842 1727204508.58505: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.58507: variable 'ansible_pipelining' from source: unknown 44842 1727204508.58509: variable 'ansible_timeout' from source: unknown 44842 1727204508.58511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.58597: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204508.58607: variable 'omit' from source: magic vars 44842 1727204508.58612: starting attempt loop 44842 1727204508.58615: running the handler 44842 1727204508.58629: _low_level_execute_command(): starting 44842 1727204508.58637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204508.59320: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.59331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.59341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.59356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.59397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.59403: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.59412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.59424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.59432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.59438: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.59445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.59455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.59467: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.59475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.59482: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.59491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.59568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.59583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.59594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.59682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.61206: stdout chunk (state=3): >>>/root <<< 44842 1727204508.61313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.61370: stderr chunk (state=3): >>><<< 44842 1727204508.61373: stdout chunk (state=3): >>><<< 44842 1727204508.61394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.61406: _low_level_execute_command(): starting 44842 1727204508.61413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342 `" && echo ansible-tmp-1727204508.613939-46350-194342934474342="` echo /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342 `" ) && sleep 0' 44842 1727204508.61870: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.61876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.61920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.61924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.61927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.61989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 
1727204508.61993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.61995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.62055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.63883: stdout chunk (state=3): >>>ansible-tmp-1727204508.613939-46350-194342934474342=/root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342 <<< 44842 1727204508.63998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.64057: stderr chunk (state=3): >>><<< 44842 1727204508.64063: stdout chunk (state=3): >>><<< 44842 1727204508.64079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204508.613939-46350-194342934474342=/root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 44842 1727204508.64105: variable 'ansible_module_compression' from source: unknown 44842 1727204508.64153: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204508.64183: variable 'ansible_facts' from source: unknown 44842 1727204508.64247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/AnsiballZ_command.py 44842 1727204508.64368: Sending initial data 44842 1727204508.64371: Sent initial data (155 bytes) 44842 1727204508.65349: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.65354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.65404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.65410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204508.65415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.65435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.65441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.65522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.65526: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.65611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.67294: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204508.67348: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204508.67412: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp7d6wq87j /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/AnsiballZ_command.py <<< 44842 1727204508.67484: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204508.68795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.68947: stderr chunk (state=3): >>><<< 44842 1727204508.68950: stdout chunk (state=3): >>><<< 44842 1727204508.68978: done transferring module to remote 44842 1727204508.68989: _low_level_execute_command(): starting 44842 1727204508.68994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/ /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/AnsiballZ_command.py && sleep 0' 44842 1727204508.69639: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.69648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.69659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.69679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.69718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.69725: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.69735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.69768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.69776: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.69779: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.69781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.69789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.69801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.69806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.69814: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.69823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.69901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.69918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.69929: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44842 1727204508.70015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.71780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.71828: stderr chunk (state=3): >>><<< 44842 1727204508.71831: stdout chunk (state=3): >>><<< 44842 1727204508.71924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.71928: _low_level_execute_command(): starting 44842 1727204508.71931: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/AnsiballZ_command.py && sleep 0' 44842 1727204508.72509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.72522: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 44842 1727204508.72536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.72552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.72599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.72611: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.72623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.72640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.72650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.72660: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.72674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.72687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.72705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.72716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.72725: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.72737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.72819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204508.72840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.72856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.72944: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44842 1727204508.86079: stdout chunk (state=3): >>> {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 15:01:48.856859", "end": "2024-09-24 15:01:48.859964", "delta": "0:00:00.003105", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204508.87186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204508.87227: stderr chunk (state=3): >>><<< 44842 1727204508.87231: stdout chunk (state=3): >>><<< 44842 1727204508.87255: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-24 15:01:48.856859", "end": "2024-09-24 15:01:48.859964", "delta": "0:00:00.003105", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204508.87300: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 rule list table 30600', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204508.87308: _low_level_execute_command(): starting 44842 1727204508.87313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204508.613939-46350-194342934474342/ > /dev/null 2>&1 && sleep 0' 44842 1727204508.88906: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204508.89040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.89052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.89072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.89113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.89141: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204508.89152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.89257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204508.89271: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204508.89280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204508.89289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204508.89298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204508.89314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204508.89331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204508.89334: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204508.89339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204508.89419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 
1727204508.89482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204508.89496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204508.89604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204508.91390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204508.91394: stdout chunk (state=3): >>><<< 44842 1727204508.91400: stderr chunk (state=3): >>><<< 44842 1727204508.91423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204508.91429: handler run complete 44842 1727204508.91453: Evaluated conditional (False): False 44842 1727204508.91467: attempt loop complete, returning result 44842 1727204508.91470: _execute() done 44842 1727204508.91473: dumping result to json 44842 
1727204508.91479: done dumping result, returning 44842 1727204508.91487: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 30600 [0affcd87-79f5-aad0-d242-00000000005e] 44842 1727204508.91493: sending task result for task 0affcd87-79f5-aad0-d242-00000000005e 44842 1727204508.91598: done sending task result for task 0affcd87-79f5-aad0-d242-00000000005e 44842 1727204508.91601: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "-6", "rule", "list", "table", "30600" ], "delta": "0:00:00.003105", "end": "2024-09-24 15:01:48.859964", "rc": 0, "start": "2024-09-24 15:01:48.856859" } STDOUT: 30600: from all to 2001:db8::4/32 lookup 30600 proto static 30601: not from all dport 128-256 lookup 30600 proto static 30602: from all lookup 30600 proto static 44842 1727204508.91672: no more pending results, returning what we have 44842 1727204508.91676: results queue empty 44842 1727204508.91678: checking for any_errors_fatal 44842 1727204508.91686: done checking for any_errors_fatal 44842 1727204508.91687: checking for max_fail_percentage 44842 1727204508.91689: done checking for max_fail_percentage 44842 1727204508.91690: checking to see if all hosts have failed and the running result is not ok 44842 1727204508.91690: done checking to see if all hosts have failed 44842 1727204508.91691: getting the remaining hosts for this loop 44842 1727204508.91693: done getting the remaining hosts for this loop 44842 1727204508.91697: getting the next task for host managed-node1 44842 1727204508.91703: done getting next task for host managed-node1 44842 1727204508.91706: ^ task is: TASK: Get the routing rule for looking up the table 'custom' 44842 1727204508.91708: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 44842 1727204508.91711: getting variables 44842 1727204508.91713: in VariableManager get_vars() 44842 1727204508.91750: Calling all_inventory to load vars for managed-node1 44842 1727204508.91753: Calling groups_inventory to load vars for managed-node1 44842 1727204508.91755: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204508.91767: Calling all_plugins_play to load vars for managed-node1 44842 1727204508.91769: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204508.91772: Calling groups_plugins_play to load vars for managed-node1 44842 1727204508.94751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204508.97056: done with get_vars() 44842 1727204508.97093: done getting variables 44842 1727204508.97158: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 'custom'] ****************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:136 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.401) 0:00:19.140 ***** 44842 1727204508.97199: entering _queue_task() for managed-node1/command 44842 1727204508.97560: worker is 1 (out of 1 available) 44842 1727204508.97575: exiting _queue_task() for managed-node1/command 44842 1727204508.97589: done queuing things up, now waiting for results queue to drain 44842 1727204508.97590: waiting for pending results... 
44842 1727204508.97890: running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 'custom' 44842 1727204508.97991: in run() - task 0affcd87-79f5-aad0-d242-00000000005f 44842 1727204508.98014: variable 'ansible_search_path' from source: unknown 44842 1727204508.98060: calling self._execute() 44842 1727204508.98166: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.98179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.98194: variable 'omit' from source: magic vars 44842 1727204508.98584: variable 'ansible_distribution_major_version' from source: facts 44842 1727204508.98603: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204508.98728: variable 'ansible_distribution_major_version' from source: facts 44842 1727204508.98739: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204508.98751: variable 'omit' from source: magic vars 44842 1727204508.98778: variable 'omit' from source: magic vars 44842 1727204508.98824: variable 'omit' from source: magic vars 44842 1727204508.98877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204508.98925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204508.98953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204508.98978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204508.98995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204508.99034: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204508.99044: variable 'ansible_host' from source: host vars for 
'managed-node1' 44842 1727204508.99052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.99157: Set connection var ansible_shell_type to sh 44842 1727204508.99176: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204508.99187: Set connection var ansible_connection to ssh 44842 1727204508.99198: Set connection var ansible_pipelining to False 44842 1727204508.99208: Set connection var ansible_timeout to 10 44842 1727204508.99219: Set connection var ansible_shell_executable to /bin/sh 44842 1727204508.99252: variable 'ansible_shell_executable' from source: unknown 44842 1727204508.99260: variable 'ansible_connection' from source: unknown 44842 1727204508.99269: variable 'ansible_module_compression' from source: unknown 44842 1727204508.99276: variable 'ansible_shell_type' from source: unknown 44842 1727204508.99283: variable 'ansible_shell_executable' from source: unknown 44842 1727204508.99290: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204508.99298: variable 'ansible_pipelining' from source: unknown 44842 1727204508.99305: variable 'ansible_timeout' from source: unknown 44842 1727204508.99317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204508.99573: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204508.99590: variable 'omit' from source: magic vars 44842 1727204508.99599: starting attempt loop 44842 1727204508.99606: running the handler 44842 1727204508.99624: _low_level_execute_command(): starting 44842 1727204508.99641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204509.00405: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.00422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.00439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.00460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.00513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.00528: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.00574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.00594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.00606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.00619: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.00636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.00651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.00670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.00684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.00696: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.00711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.00795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.00819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.00836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44842 1727204509.00928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.02443: stdout chunk (state=3): >>>/root <<< 44842 1727204509.02634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.02637: stdout chunk (state=3): >>><<< 44842 1727204509.02639: stderr chunk (state=3): >>><<< 44842 1727204509.02751: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204509.02755: _low_level_execute_command(): starting 44842 1727204509.02758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740 `" && echo ansible-tmp-1727204509.0265825-46405-8550598956740="` echo 
/root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740 `" ) && sleep 0' 44842 1727204509.03322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.03335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.03350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.03372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.03421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.03434: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.03448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.03469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.03482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.03494: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.03514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.03529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.03545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.03558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.03571: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.03585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.03668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.03691: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.03707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.03796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.05620: stdout chunk (state=3): >>>ansible-tmp-1727204509.0265825-46405-8550598956740=/root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740 <<< 44842 1727204509.05736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.05819: stderr chunk (state=3): >>><<< 44842 1727204509.05822: stdout chunk (state=3): >>><<< 44842 1727204509.06074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204509.0265825-46405-8550598956740=/root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
44842 1727204509.06077: variable 'ansible_module_compression' from source: unknown 44842 1727204509.06080: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204509.06084: variable 'ansible_facts' from source: unknown 44842 1727204509.06086: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/AnsiballZ_command.py 44842 1727204509.06414: Sending initial data 44842 1727204509.06417: Sent initial data (154 bytes) 44842 1727204509.07459: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.07477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.07493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.07512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.07565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.07579: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.07593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.07610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.07622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.07632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.07647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.07667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.07685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
44842 1727204509.07697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.07708: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.07722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.07805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.07828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.07843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.07994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.09619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204509.09669: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204509.09727: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpfbox2dgf /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/AnsiballZ_command.py <<< 44842 1727204509.09799: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204509.11241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 
1727204509.11478: stderr chunk (state=3): >>><<< 44842 1727204509.11481: stdout chunk (state=3): >>><<< 44842 1727204509.11484: done transferring module to remote 44842 1727204509.11486: _low_level_execute_command(): starting 44842 1727204509.11492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/ /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/AnsiballZ_command.py && sleep 0' 44842 1727204509.13083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.13099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.13140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.13175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.13230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.13254: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.13270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.13288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.13301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.13313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.13325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.13352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.13382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.13411: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.13431: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.13453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.13537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.13578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.13623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.13708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.15497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.15500: stdout chunk (state=3): >>><<< 44842 1727204509.15503: stderr chunk (state=3): >>><<< 44842 1727204509.15616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204509.15620: _low_level_execute_command(): starting 44842 1727204509.15624: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/AnsiballZ_command.py && sleep 0' 44842 1727204509.16281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.16301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.16315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.16333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.16379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.16393: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.16412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.16429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.16443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.16453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.16471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.16488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.16505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.16522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.16533: stderr chunk (state=3): >>>debug2: 
match found <<< 44842 1727204509.16547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.16646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.16674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.16691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.16788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.30074: stdout chunk (state=3): >>> {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 15:01:49.296667", "end": "2024-09-24 15:01:49.299879", "delta": "0:00:00.003212", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204509.31300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204509.31304: stdout chunk (state=3): >>><<< 44842 1727204509.31307: stderr chunk (state=3): >>><<< 44842 1727204509.31462: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-24 15:01:49.296667", "end": "2024-09-24 15:01:49.299879", "delta": "0:00:00.003212", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204509.31475: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204509.31478: _low_level_execute_command(): starting 44842 1727204509.31480: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204509.0265825-46405-8550598956740/ > /dev/null 2>&1 && sleep 0' 44842 1727204509.32537: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.32541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.32581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.32585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204509.32589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.32639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.33587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.33590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.33671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.35429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.35510: stderr chunk (state=3): >>><<< 44842 1727204509.35514: stdout chunk (state=3): >>><<< 44842 1727204509.35773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 44842 1727204509.35777: handler run complete 44842 1727204509.35780: Evaluated conditional (False): False 44842 1727204509.35782: attempt loop complete, returning result 44842 1727204509.35784: _execute() done 44842 1727204509.35786: dumping result to json 44842 1727204509.35788: done dumping result, returning 44842 1727204509.35790: done running TaskExecutor() for managed-node1/TASK: Get the routing rule for looking up the table 'custom' [0affcd87-79f5-aad0-d242-00000000005f] 44842 1727204509.35792: sending task result for task 0affcd87-79f5-aad0-d242-00000000005f 44842 1727204509.35873: done sending task result for task 0affcd87-79f5-aad0-d242-00000000005f 44842 1727204509.35878: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "custom" ], "delta": "0:00:00.003212", "end": "2024-09-24 15:01:49.299879", "rc": 0, "start": "2024-09-24 15:01:49.296667" } STDOUT: 200: from 198.51.100.56/26 lookup custom proto static 44842 1727204509.35965: no more pending results, returning what we have 44842 1727204509.35970: results queue empty 44842 1727204509.35971: checking for any_errors_fatal 44842 1727204509.35978: done checking for any_errors_fatal 44842 1727204509.35979: checking for max_fail_percentage 44842 1727204509.35981: done checking for max_fail_percentage 44842 1727204509.35982: checking to see if all hosts have failed and the running result is not ok 44842 1727204509.35983: done checking to see if all hosts have failed 44842 1727204509.35984: getting the remaining hosts for this loop 44842 1727204509.35986: done getting the remaining hosts for this loop 44842 1727204509.35991: getting the next task for host managed-node1 44842 1727204509.36000: done getting next task for host managed-node1 44842 1727204509.36002: ^ task is: TASK: Get the IPv4 routing rule for the connection "{{ interface }}" 44842 1727204509.36005: ^ state is: HOST STATE: 
block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204509.36009: getting variables 44842 1727204509.36011: in VariableManager get_vars() 44842 1727204509.36052: Calling all_inventory to load vars for managed-node1 44842 1727204509.36055: Calling groups_inventory to load vars for managed-node1 44842 1727204509.36058: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204509.36073: Calling all_plugins_play to load vars for managed-node1 44842 1727204509.36077: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204509.36080: Calling groups_plugins_play to load vars for managed-node1 44842 1727204509.47673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204509.49410: done with get_vars() 44842 1727204509.49439: done getting variables 44842 1727204509.49500: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204509.49612: variable 'interface' from source: set_fact TASK [Get the IPv4 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:143 Tuesday 24 September 2024 15:01:49 -0400 (0:00:00.524) 0:00:19.664 ***** 44842 1727204509.49637: entering _queue_task() for managed-node1/command 44842 1727204509.49996: worker is 1 (out of 1 available) 44842 1727204509.50009: exiting _queue_task() for managed-node1/command 44842 1727204509.50021: done 
queuing things up, now waiting for results queue to drain 44842 1727204509.50022: waiting for pending results... 44842 1727204509.50391: running TaskExecutor() for managed-node1/TASK: Get the IPv4 routing rule for the connection "ethtest0" 44842 1727204509.50507: in run() - task 0affcd87-79f5-aad0-d242-000000000060 44842 1727204509.50525: variable 'ansible_search_path' from source: unknown 44842 1727204509.50577: calling self._execute() 44842 1727204509.50689: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204509.50701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204509.50717: variable 'omit' from source: magic vars 44842 1727204509.51132: variable 'ansible_distribution_major_version' from source: facts 44842 1727204509.51151: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204509.51168: variable 'omit' from source: magic vars 44842 1727204509.51194: variable 'omit' from source: magic vars 44842 1727204509.51313: variable 'interface' from source: set_fact 44842 1727204509.51339: variable 'omit' from source: magic vars 44842 1727204509.51391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204509.51435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204509.51468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204509.51490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204509.51506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204509.51545: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204509.51555: variable 'ansible_host' from source: host vars for 
'managed-node1' 44842 1727204509.51567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204509.51672: Set connection var ansible_shell_type to sh 44842 1727204509.51687: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204509.51696: Set connection var ansible_connection to ssh 44842 1727204509.51703: Set connection var ansible_pipelining to False 44842 1727204509.51711: Set connection var ansible_timeout to 10 44842 1727204509.51720: Set connection var ansible_shell_executable to /bin/sh 44842 1727204509.51749: variable 'ansible_shell_executable' from source: unknown 44842 1727204509.51758: variable 'ansible_connection' from source: unknown 44842 1727204509.51774: variable 'ansible_module_compression' from source: unknown 44842 1727204509.51780: variable 'ansible_shell_type' from source: unknown 44842 1727204509.51786: variable 'ansible_shell_executable' from source: unknown 44842 1727204509.51791: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204509.51798: variable 'ansible_pipelining' from source: unknown 44842 1727204509.51803: variable 'ansible_timeout' from source: unknown 44842 1727204509.51809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204509.51951: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204509.51977: variable 'omit' from source: magic vars 44842 1727204509.51990: starting attempt loop 44842 1727204509.51997: running the handler 44842 1727204509.52016: _low_level_execute_command(): starting 44842 1727204509.52029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204509.52846: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.52870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.52888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.52907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.52953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.52978: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.52995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.53014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.53028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.53039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.53052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.53081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.53102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.53116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.53128: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.53142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.53229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.53247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.53267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44842 1727204509.53412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.54924: stdout chunk (state=3): >>>/root <<< 44842 1727204509.55121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.55125: stdout chunk (state=3): >>><<< 44842 1727204509.55128: stderr chunk (state=3): >>><<< 44842 1727204509.55255: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204509.55262: _low_level_execute_command(): starting 44842 1727204509.55269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954 `" && echo ansible-tmp-1727204509.5515041-46478-128777772636954="` echo 
/root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954 `" ) && sleep 0' 44842 1727204509.57143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.57147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.57187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204509.57191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.57194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.57262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.57271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.57274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.57337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.59188: stdout chunk (state=3): >>>ansible-tmp-1727204509.5515041-46478-128777772636954=/root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954 <<< 44842 1727204509.59303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.59384: stderr chunk (state=3): 
>>><<< 44842 1727204509.59387: stdout chunk (state=3): >>><<< 44842 1727204509.59673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204509.5515041-46478-128777772636954=/root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204509.59677: variable 'ansible_module_compression' from source: unknown 44842 1727204509.59680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204509.59682: variable 'ansible_facts' from source: unknown 44842 1727204509.59684: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/AnsiballZ_command.py 44842 1727204509.60226: Sending initial data 44842 1727204509.60230: Sent initial data (156 bytes) 44842 1727204509.62256: stderr chunk 
(state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.62317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.62334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.62354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.62405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.62532: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.62548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.62572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.62585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.62598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.62611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.62630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.62647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.62666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.62679: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.62694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.62783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.62873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.62892: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44842 1727204509.63084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.64707: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204509.64755: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204509.64809: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp7u6ldfvw /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/AnsiballZ_command.py <<< 44842 1727204509.64858: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204509.66511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.66682: stderr chunk (state=3): >>><<< 44842 1727204509.66686: stdout chunk (state=3): >>><<< 44842 1727204509.66688: done transferring module to remote 44842 1727204509.66690: _low_level_execute_command(): starting 44842 1727204509.66693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/ /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/AnsiballZ_command.py && sleep 0' 44842 1727204509.68082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 
1727204509.68140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.68155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.68183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.68271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.68284: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.68298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.68347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.68367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.68380: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.68392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.68404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.68419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.68431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.68446: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.68461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.68537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.68681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.68702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.68811: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.70603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.70606: stdout chunk (state=3): >>><<< 44842 1727204509.70609: stderr chunk (state=3): >>><<< 44842 1727204509.70712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204509.70716: _low_level_execute_command(): starting 44842 1727204509.70719: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/AnsiballZ_command.py && sleep 0' 44842 1727204509.73139: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204509.73153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.73174: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.73191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.73234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.73246: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204509.73258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.73289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204509.73300: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204509.73310: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204509.73322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204509.73339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.73353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.73776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204509.73789: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204509.73803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.74000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.74017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.74032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.74130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.88884: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:01:49.870050", "end": "2024-09-24 15:01:49.887978", "delta": "0:00:00.017928", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204509.90069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204509.90073: stdout chunk (state=3): >>><<< 44842 1727204509.90076: stderr chunk (state=3): >>><<< 44842 1727204509.90097: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:01:49.870050", "end": "2024-09-24 15:01:49.887978", "delta": "0:00:00.017928", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204509.90137: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv4.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204509.90145: _low_level_execute_command(): starting 44842 1727204509.90148: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204509.5515041-46478-128777772636954/ > /dev/null 2>&1 && sleep 0' 44842 1727204509.91830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.91835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204509.92006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.92010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204509.92027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204509.92031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204509.92220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204509.92235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204509.92245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204509.92329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204509.94207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204509.94211: stdout chunk (state=3): >>><<< 44842 1727204509.94213: stderr chunk (state=3): >>><<< 44842 1727204509.94474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204509.94478: handler run complete
44842 1727204509.94480: Evaluated conditional (False): False
44842 1727204509.94483: attempt loop complete, returning result
44842 1727204509.94485: _execute() done
44842 1727204509.94486: dumping result to json
44842 1727204509.94488: done dumping result, returning
44842 1727204509.94490: done running TaskExecutor() for managed-node1/TASK: Get the IPv4 routing rule for the connection "ethtest0" [0affcd87-79f5-aad0-d242-000000000060]
44842 1727204509.94492: sending task result for task 0affcd87-79f5-aad0-d242-000000000060
44842 1727204509.94576: done sending task result for task 0affcd87-79f5-aad0-d242-000000000060
44842 1727204509.94580: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "nmcli",
        "-f",
        "ipv4.routing-rules",
        "c",
        "show",
        "ethtest0"
    ],
    "delta": "0:00:00.017928",
    "end": "2024-09-24 15:01:49.887978",
    "rc": 0,
    "start": "2024-09-24 15:01:49.870050"
}

STDOUT:

ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200

44842 1727204509.94659: no more pending results, returning what we have
44842 1727204509.94668: results queue empty
44842 1727204509.94669: checking for any_errors_fatal
44842 1727204509.94676: done checking for any_errors_fatal
44842 1727204509.94677: checking for max_fail_percentage
44842 1727204509.94679: done checking for max_fail_percentage
44842 1727204509.94680: checking to see if all hosts have failed and the running result is not ok
44842 1727204509.94681: done checking to see if all hosts have failed
44842 1727204509.94681: getting the remaining hosts for this loop
44842 1727204509.94683: done getting the remaining hosts for this loop
44842 1727204509.94688: getting the next task for host managed-node1
44842 1727204509.94696: done getting next task for host managed-node1
44842 1727204509.94699: ^ task is: TASK: Get the IPv6 routing rule for the connection "{{ interface }}"
44842 1727204509.94701: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204509.94705: getting variables
44842 1727204509.94707: in VariableManager get_vars()
44842 1727204509.94747: Calling all_inventory to load vars for managed-node1
44842 1727204509.94750: Calling groups_inventory to load vars for managed-node1
44842 1727204509.94752: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204509.94768: Calling all_plugins_play to load vars for managed-node1
44842 1727204509.94771: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204509.94775: Calling groups_plugins_play to load vars for managed-node1
44842 1727204509.97573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204510.01308: done with get_vars()
44842 1727204510.01341: done getting variables
44842 1727204510.01525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
44842 1727204510.01705: variable 'interface' from source: set_fact

TASK [Get the IPv6 routing rule for the connection "ethtest0"] *****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:149
Tuesday 24 September 2024 15:01:50 -0400 (0:00:00.522) 0:00:20.186 *****
44842 1727204510.01854: entering _queue_task() for managed-node1/command
44842 1727204510.02541: worker is 1 (out of 1 available)
44842 1727204510.02555: exiting _queue_task() for managed-node1/command
44842 1727204510.02571: done queuing things up, now waiting for results queue to drain
44842 1727204510.02572: waiting for pending results...
44842 1727204510.03558: running TaskExecutor() for managed-node1/TASK: Get the IPv6 routing rule for the connection "ethtest0"
44842 1727204510.03680: in run() - task 0affcd87-79f5-aad0-d242-000000000061
44842 1727204510.03818: variable 'ansible_search_path' from source: unknown
44842 1727204510.03868: calling self._execute()
44842 1727204510.04122: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204510.04138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204510.04253: variable 'omit' from source: magic vars
44842 1727204510.04996: variable 'ansible_distribution_major_version' from source: facts
44842 1727204510.05130: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204510.05142: variable 'omit' from source: magic vars
44842 1727204510.05175: variable 'omit' from source: magic vars
44842 1727204510.05403: variable 'interface' from source: set_fact
44842 1727204510.05425: variable 'omit' from source: magic vars
44842 1727204510.05486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204510.05591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204510.05687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204510.05710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204510.05788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204510.05825: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204510.05834: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204510.05890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204510.06078: Set connection var ansible_shell_type to sh
44842 1727204510.06156: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204510.06174: Set connection var ansible_connection to ssh
44842 1727204510.06184: Set connection var ansible_pipelining to False
44842 1727204510.06194: Set connection var ansible_timeout to 10
44842 1727204510.06213: Set connection var ansible_shell_executable to /bin/sh
44842 1727204510.06242: variable 'ansible_shell_executable' from source: unknown
44842 1727204510.06251: variable 'ansible_connection' from source: unknown
44842 1727204510.06264: variable 'ansible_module_compression' from source: unknown
44842 1727204510.06277: variable 'ansible_shell_type' from source: unknown
44842 1727204510.06285: variable 'ansible_shell_executable' from source: unknown
44842 1727204510.06293: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204510.06303: variable 'ansible_pipelining' from source: unknown
44842 1727204510.06313: variable 'ansible_timeout' from source: unknown
44842 1727204510.06327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204510.06489: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
44842 1727204510.06507: variable 'omit' from source: magic vars
44842 1727204510.06518: starting attempt loop
44842 1727204510.06525: running the handler
44842 1727204510.06557: _low_level_execute_command(): starting
44842 1727204510.06576: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44842 1727204510.07430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204510.07446: stderr chunk (state=3): >>>debug1: Reading configuration data
/root/.ssh/config <<< 44842 1727204510.07468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.07488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.07543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.07556: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204510.07580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.07599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204510.07612: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204510.07628: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204510.07642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.07667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.07686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.07700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.07712: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204510.07727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.07817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204510.07847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204510.07876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204510.07987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44842 1727204510.09544: stdout chunk (state=3): >>>/root <<< 44842 1727204510.09647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204510.09743: stderr chunk (state=3): >>><<< 44842 1727204510.09746: stdout chunk (state=3): >>><<< 44842 1727204510.09884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204510.09888: _low_level_execute_command(): starting 44842 1727204510.09892: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022 `" && echo ansible-tmp-1727204510.0977466-46493-227583022912022="` echo /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022 `" ) && sleep 0' 44842 1727204510.11494: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204510.11644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.11666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.11687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.11741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.11756: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204510.11777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.11797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204510.11809: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204510.11821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204510.11833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.11855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.11879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.11892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.11980: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204510.11995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.12081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204510.12206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204510.12222: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44842 1727204510.12420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204510.14220: stdout chunk (state=3): >>>ansible-tmp-1727204510.0977466-46493-227583022912022=/root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022 <<< 44842 1727204510.14403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204510.14407: stderr chunk (state=3): >>><<< 44842 1727204510.14416: stdout chunk (state=3): >>><<< 44842 1727204510.14436: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204510.0977466-46493-227583022912022=/root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204510.14475: variable 'ansible_module_compression' from source: unknown 44842 1727204510.14531: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204510.14572: variable 'ansible_facts' from source: unknown 44842 1727204510.14667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/AnsiballZ_command.py 44842 1727204510.15234: Sending initial data 44842 1727204510.15237: Sent initial data (156 bytes) 44842 1727204510.17769: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.17774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.17952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.17957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.17980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204510.17984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.18180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204510.18199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204510.18353: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44842 1727204510.20142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204510.20196: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204510.20250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpegj8vvhi /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/AnsiballZ_command.py <<< 44842 1727204510.20308: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204510.21780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204510.21865: stderr chunk (state=3): >>><<< 44842 1727204510.21869: stdout chunk (state=3): >>><<< 44842 1727204510.21888: done transferring module to remote 44842 1727204510.21899: _low_level_execute_command(): starting 44842 1727204510.21904: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/ /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/AnsiballZ_command.py && sleep 0' 44842 1727204510.24936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 
1727204510.24942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.25212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.25216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.25300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204510.25304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.25385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204510.25393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204510.25626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204510.25729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204510.27535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204510.27539: stderr chunk (state=3): >>><<< 44842 1727204510.27542: stdout chunk (state=3): >>><<< 44842 1727204510.27565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204510.27570: _low_level_execute_command(): starting 44842 1727204510.27572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/AnsiballZ_command.py && sleep 0' 44842 1727204510.29535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204510.29544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.29555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.29572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.29618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.29708: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204510.29718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.29732: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 44842 1727204510.29740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204510.29747: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204510.29754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.29767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.29778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.29786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.29793: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204510.29802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.29878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204510.29935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204510.29948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204510.30045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204510.44947: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:01:50.430811", "end": "2024-09-24 15:01:50.448651", "delta": "0:00:00.017840", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204510.46086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204510.46149: stderr chunk (state=3): >>><<< 44842 1727204510.46153: stdout chunk (state=3): >>><<< 44842 1727204510.46176: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-24 15:01:50.430811", "end": "2024-09-24 15:01:50.448651", "delta": "0:00:00.017840", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204510.46216: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv6.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204510.46225: _low_level_execute_command(): starting 44842 1727204510.46230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204510.0977466-46493-227583022912022/ > /dev/null 2>&1 && sleep 0' 44842 1727204510.48205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204510.48215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.48226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.48241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.48301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.48337: stderr 
chunk (state=3): >>>debug2: match not found <<< 44842 1727204510.48347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.48365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204510.48384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204510.48392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204510.48400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204510.48421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204510.48485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204510.48497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204510.48504: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204510.48514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204510.48709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204510.48730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204510.48742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204510.48830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204510.50669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204510.50673: stdout chunk (state=3): >>><<< 44842 1727204510.50679: stderr chunk (state=3): >>><<< 44842 1727204510.50700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204510.50706: handler run complete 44842 1727204510.50732: Evaluated conditional (False): False 44842 1727204510.50742: attempt loop complete, returning result 44842 1727204510.50745: _execute() done 44842 1727204510.50747: dumping result to json 44842 1727204510.50753: done dumping result, returning 44842 1727204510.50765: done running TaskExecutor() for managed-node1/TASK: Get the IPv6 routing rule for the connection "ethtest0" [0affcd87-79f5-aad0-d242-000000000061] 44842 1727204510.50768: sending task result for task 0affcd87-79f5-aad0-d242-000000000061 44842 1727204510.50884: done sending task result for task 0affcd87-79f5-aad0-d242-000000000061 44842 1727204510.50886: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.017840", "end": "2024-09-24 15:01:50.448651", "rc": 0, "start": "2024-09-24 15:01:50.430811" } STDOUT: ipv6.routing-rules: priority 30600 to 
2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600 44842 1727204510.50956: no more pending results, returning what we have 44842 1727204510.50962: results queue empty 44842 1727204510.50963: checking for any_errors_fatal 44842 1727204510.50972: done checking for any_errors_fatal 44842 1727204510.50972: checking for max_fail_percentage 44842 1727204510.50974: done checking for max_fail_percentage 44842 1727204510.50975: checking to see if all hosts have failed and the running result is not ok 44842 1727204510.50976: done checking to see if all hosts have failed 44842 1727204510.50977: getting the remaining hosts for this loop 44842 1727204510.50979: done getting the remaining hosts for this loop 44842 1727204510.50983: getting the next task for host managed-node1 44842 1727204510.50989: done getting next task for host managed-node1 44842 1727204510.50992: ^ task is: TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 44842 1727204510.50994: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204510.50998: getting variables 44842 1727204510.50999: in VariableManager get_vars() 44842 1727204510.51036: Calling all_inventory to load vars for managed-node1 44842 1727204510.51039: Calling groups_inventory to load vars for managed-node1 44842 1727204510.51041: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204510.51050: Calling all_plugins_play to load vars for managed-node1 44842 1727204510.51053: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204510.51055: Calling groups_plugins_play to load vars for managed-node1 44842 1727204510.54306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204510.56591: done with get_vars() 44842 1727204510.56620: done getting variables 44842 1727204510.56683: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30200 matches the specified rule] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:155 Tuesday 24 September 2024 15:01:50 -0400 (0:00:00.548) 0:00:20.735 ***** 44842 1727204510.56713: entering _queue_task() for managed-node1/assert 44842 1727204510.57123: worker is 1 (out of 1 available) 44842 1727204510.57136: exiting _queue_task() for managed-node1/assert 44842 1727204510.57149: done queuing things up, now waiting for results queue to drain 44842 1727204510.57150: waiting for pending results... 
44842 1727204510.58063: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 44842 1727204510.58195: in run() - task 0affcd87-79f5-aad0-d242-000000000062 44842 1727204510.58216: variable 'ansible_search_path' from source: unknown 44842 1727204510.58265: calling self._execute() 44842 1727204510.58441: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.58453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.58479: variable 'omit' from source: magic vars 44842 1727204510.59040: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.59070: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204510.59256: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.59275: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204510.59287: variable 'omit' from source: magic vars 44842 1727204510.59320: variable 'omit' from source: magic vars 44842 1727204510.59369: variable 'omit' from source: magic vars 44842 1727204510.59425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204510.59482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204510.59510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204510.59542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.59562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.59603: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204510.59611: variable 'ansible_host' 
from source: host vars for 'managed-node1' 44842 1727204510.59618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.59752: Set connection var ansible_shell_type to sh 44842 1727204510.59775: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204510.59784: Set connection var ansible_connection to ssh 44842 1727204510.59792: Set connection var ansible_pipelining to False 44842 1727204510.59801: Set connection var ansible_timeout to 10 44842 1727204510.59811: Set connection var ansible_shell_executable to /bin/sh 44842 1727204510.59837: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.59845: variable 'ansible_connection' from source: unknown 44842 1727204510.59858: variable 'ansible_module_compression' from source: unknown 44842 1727204510.59871: variable 'ansible_shell_type' from source: unknown 44842 1727204510.59879: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.59885: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.59893: variable 'ansible_pipelining' from source: unknown 44842 1727204510.59901: variable 'ansible_timeout' from source: unknown 44842 1727204510.59909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.60073: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204510.60093: variable 'omit' from source: magic vars 44842 1727204510.60104: starting attempt loop 44842 1727204510.60110: running the handler 44842 1727204510.60298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204510.60572: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204510.60639: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204510.60748: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204510.60792: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204510.60938: variable 'route_rule_table_30200' from source: set_fact 44842 1727204510.60988: Evaluated conditional (route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")): True 44842 1727204510.61197: variable 'route_rule_table_30200' from source: set_fact 44842 1727204510.61249: Evaluated conditional (route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")): True 44842 1727204510.61468: variable 'route_rule_table_30200' from source: set_fact 44842 1727204510.61503: Evaluated conditional (route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")): True 44842 1727204510.61690: variable 'route_rule_table_30200' from source: set_fact 44842 1727204510.61725: Evaluated conditional (route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")): True 44842 1727204510.61880: variable 'route_rule_table_30200' from source: set_fact 44842 1727204510.61915: Evaluated conditional (route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")): True 44842 1727204510.61936: handler run complete 44842 1727204510.61968: attempt loop complete, returning result 44842 1727204510.61977: _execute() done 44842 1727204510.61984: dumping result to json 44842 1727204510.61990: done dumping result, returning 44842 1727204510.62001: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 
[0affcd87-79f5-aad0-d242-000000000062] 44842 1727204510.62010: sending task result for task 0affcd87-79f5-aad0-d242-000000000062 44842 1727204510.62133: done sending task result for task 0affcd87-79f5-aad0-d242-000000000062 44842 1727204510.62141: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204510.62197: no more pending results, returning what we have 44842 1727204510.62201: results queue empty 44842 1727204510.62202: checking for any_errors_fatal 44842 1727204510.62210: done checking for any_errors_fatal 44842 1727204510.62211: checking for max_fail_percentage 44842 1727204510.62212: done checking for max_fail_percentage 44842 1727204510.62213: checking to see if all hosts have failed and the running result is not ok 44842 1727204510.62214: done checking to see if all hosts have failed 44842 1727204510.62215: getting the remaining hosts for this loop 44842 1727204510.62216: done getting the remaining hosts for this loop 44842 1727204510.62221: getting the next task for host managed-node1 44842 1727204510.62227: done getting next task for host managed-node1 44842 1727204510.62230: ^ task is: TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 44842 1727204510.62231: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204510.62234: getting variables 44842 1727204510.62236: in VariableManager get_vars() 44842 1727204510.62283: Calling all_inventory to load vars for managed-node1 44842 1727204510.62286: Calling groups_inventory to load vars for managed-node1 44842 1727204510.62288: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204510.62305: Calling all_plugins_play to load vars for managed-node1 44842 1727204510.62307: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204510.62309: Calling groups_plugins_play to load vars for managed-node1 44842 1727204510.63903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204510.67345: done with get_vars() 44842 1727204510.67379: done getting variables 44842 1727204510.67458: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30400 matches the specified rule] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:166 Tuesday 24 September 2024 15:01:50 -0400 (0:00:00.107) 0:00:20.843 ***** 44842 1727204510.67497: entering _queue_task() for managed-node1/assert 44842 1727204510.67954: worker is 1 (out of 1 available) 44842 1727204510.67971: exiting _queue_task() for managed-node1/assert 44842 1727204510.67984: done queuing things up, now waiting for results queue to drain 44842 1727204510.67985: waiting for pending results... 
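The assert task above evaluates conditionals of the form `route_rule_table_30200.stdout is search("...")`. Ansible's `search` test is an unanchored regular-expression search over the string, equivalent to Python's `re.search()`. A sketch reproducing the first few evaluations locally (the sample stdout lines are constructed here to match the patterns in the log; they are assumptions, not copied output):

```python
import re

# Hypothetical `ip rule`-style output shaped to satisfy the patterns
# asserted in the log above.
stdout = ("30200:  from 198.51.100.58/26 lookup 30200\n"
          "30201:  from all fwmark 0x1/0x1 lookup 30200\n"
          "30202:  from all ipproto tcp lookup 30200")

# The exact patterns evaluated by the assert task in the log.
patterns = [
    r"30200:(\s+)from 198.51.100.58/26 lookup 30200",
    r"30201:(\s+)from all fwmark 0x1/0x1 lookup 30200",
    r"30202:(\s+)from all ipproto tcp lookup 30200",
]

# Each conditional passes iff re.search finds the pattern anywhere in stdout.
results = [bool(re.search(p, stdout)) for p in patterns]
print(results)  # [True, True, True]
```

Each `Evaluated conditional (...): True` line in the log corresponds to one such `re.search` returning a match.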
44842 1727204510.68316: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 44842 1727204510.68436: in run() - task 0affcd87-79f5-aad0-d242-000000000063 44842 1727204510.68462: variable 'ansible_search_path' from source: unknown 44842 1727204510.68517: calling self._execute() 44842 1727204510.68641: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.68658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.68679: variable 'omit' from source: magic vars 44842 1727204510.69112: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.69142: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204510.71642: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.71658: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204510.71679: variable 'omit' from source: magic vars 44842 1727204510.71718: variable 'omit' from source: magic vars 44842 1727204510.71770: variable 'omit' from source: magic vars 44842 1727204510.71832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204510.71875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204510.71915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204510.71938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.71955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.72198: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204510.72223: variable 'ansible_host' 
from source: host vars for 'managed-node1' 44842 1727204510.72232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.72555: Set connection var ansible_shell_type to sh 44842 1727204510.72576: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204510.72586: Set connection var ansible_connection to ssh 44842 1727204510.72595: Set connection var ansible_pipelining to False 44842 1727204510.72612: Set connection var ansible_timeout to 10 44842 1727204510.72623: Set connection var ansible_shell_executable to /bin/sh 44842 1727204510.72656: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.72668: variable 'ansible_connection' from source: unknown 44842 1727204510.72674: variable 'ansible_module_compression' from source: unknown 44842 1727204510.72680: variable 'ansible_shell_type' from source: unknown 44842 1727204510.72686: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.72691: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.72697: variable 'ansible_pipelining' from source: unknown 44842 1727204510.72703: variable 'ansible_timeout' from source: unknown 44842 1727204510.72709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.72884: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204510.72900: variable 'omit' from source: magic vars 44842 1727204510.72909: starting attempt loop 44842 1727204510.72915: running the handler 44842 1727204510.73111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204510.73613: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204510.73671: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204510.74074: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204510.74116: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204510.74454: variable 'route_rule_table_30400' from source: set_fact 44842 1727204510.74522: Evaluated conditional (route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")): True 44842 1727204510.74935: variable 'route_rule_table_30400' from source: set_fact 44842 1727204510.74975: Evaluated conditional (route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")): True 44842 1727204510.75212: variable 'route_rule_table_30400' from source: set_fact 44842 1727204510.75251: Evaluated conditional (route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")): True 44842 1727204510.75275: handler run complete 44842 1727204510.75295: attempt loop complete, returning result 44842 1727204510.75301: _execute() done 44842 1727204510.75308: dumping result to json 44842 1727204510.75316: done dumping result, returning 44842 1727204510.75328: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule [0affcd87-79f5-aad0-d242-000000000063] 44842 1727204510.75343: sending task result for task 0affcd87-79f5-aad0-d242-000000000063 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204510.75527: no more pending results, returning what we have 44842 1727204510.75531: results queue empty 44842 1727204510.75532: checking for any_errors_fatal 44842 1727204510.75538: done checking for any_errors_fatal 44842 
1727204510.75539: checking for max_fail_percentage 44842 1727204510.75541: done checking for max_fail_percentage 44842 1727204510.75542: checking to see if all hosts have failed and the running result is not ok 44842 1727204510.75542: done checking to see if all hosts have failed 44842 1727204510.75543: getting the remaining hosts for this loop 44842 1727204510.75545: done getting the remaining hosts for this loop 44842 1727204510.75549: getting the next task for host managed-node1 44842 1727204510.75558: done getting next task for host managed-node1 44842 1727204510.75563: ^ task is: TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 44842 1727204510.75567: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204510.75571: getting variables 44842 1727204510.75573: in VariableManager get_vars() 44842 1727204510.75617: Calling all_inventory to load vars for managed-node1 44842 1727204510.75620: Calling groups_inventory to load vars for managed-node1 44842 1727204510.75623: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204510.75634: Calling all_plugins_play to load vars for managed-node1 44842 1727204510.75637: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204510.75641: Calling groups_plugins_play to load vars for managed-node1 44842 1727204510.77456: done sending task result for task 0affcd87-79f5-aad0-d242-000000000063 44842 1727204510.77462: WORKER PROCESS EXITING 44842 1727204510.79390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204510.82715: done with get_vars() 44842 1727204510.82772: done getting variables 44842 1727204510.83116: Loading ActionModule 
'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30600 matches the specified rule] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:175 Tuesday 24 September 2024 15:01:50 -0400 (0:00:00.156) 0:00:20.999 ***** 44842 1727204510.83146: entering _queue_task() for managed-node1/assert 44842 1727204510.84358: worker is 1 (out of 1 available) 44842 1727204510.84410: exiting _queue_task() for managed-node1/assert 44842 1727204510.84636: done queuing things up, now waiting for results queue to drain 44842 1727204510.84638: waiting for pending results... 44842 1727204510.86697: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 44842 1727204510.87077: in run() - task 0affcd87-79f5-aad0-d242-000000000064 44842 1727204510.87083: variable 'ansible_search_path' from source: unknown 44842 1727204510.87106: calling self._execute() 44842 1727204510.87325: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.87329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.87338: variable 'omit' from source: magic vars 44842 1727204510.88225: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.88238: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204510.88415: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.88432: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204510.88447: variable 'omit' from source: magic vars 44842 1727204510.88480: variable 'omit' 
from source: magic vars 44842 1727204510.88538: variable 'omit' from source: magic vars 44842 1727204510.88591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204510.88643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204510.88676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204510.88700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.88719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.88767: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204510.88777: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.88784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.88893: Set connection var ansible_shell_type to sh 44842 1727204510.88908: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204510.88916: Set connection var ansible_connection to ssh 44842 1727204510.88925: Set connection var ansible_pipelining to False 44842 1727204510.88933: Set connection var ansible_timeout to 10 44842 1727204510.88950: Set connection var ansible_shell_executable to /bin/sh 44842 1727204510.88986: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.88994: variable 'ansible_connection' from source: unknown 44842 1727204510.88999: variable 'ansible_module_compression' from source: unknown 44842 1727204510.89005: variable 'ansible_shell_type' from source: unknown 44842 1727204510.89011: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.89016: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 
1727204510.89022: variable 'ansible_pipelining' from source: unknown 44842 1727204510.89027: variable 'ansible_timeout' from source: unknown 44842 1727204510.89066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.89407: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204510.89433: variable 'omit' from source: magic vars 44842 1727204510.89453: starting attempt loop 44842 1727204510.89481: running the handler 44842 1727204510.89819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204510.90328: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204510.90406: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204510.90591: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204510.90680: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204510.90798: variable 'route_rule_table_30600' from source: set_fact 44842 1727204510.90849: Evaluated conditional (route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")): True 44842 1727204510.91135: variable 'route_rule_table_30600' from source: set_fact 44842 1727204510.91172: Evaluated conditional (route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")): True 44842 1727204510.91187: handler run complete 44842 1727204510.91205: attempt loop complete, returning result 44842 1727204510.91211: _execute() done 44842 1727204510.91218: dumping result 
to json 44842 1727204510.91225: done dumping result, returning 44842 1727204510.91243: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule [0affcd87-79f5-aad0-d242-000000000064] 44842 1727204510.91253: sending task result for task 0affcd87-79f5-aad0-d242-000000000064 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204510.91518: no more pending results, returning what we have 44842 1727204510.91522: results queue empty 44842 1727204510.91523: checking for any_errors_fatal 44842 1727204510.91531: done checking for any_errors_fatal 44842 1727204510.91531: checking for max_fail_percentage 44842 1727204510.91534: done checking for max_fail_percentage 44842 1727204510.91537: checking to see if all hosts have failed and the running result is not ok 44842 1727204510.91554: done checking to see if all hosts have failed 44842 1727204510.91568: getting the remaining hosts for this loop 44842 1727204510.91572: done getting the remaining hosts for this loop 44842 1727204510.91585: getting the next task for host managed-node1 44842 1727204510.91620: done getting next task for host managed-node1 44842 1727204510.91625: ^ task is: TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 44842 1727204510.91627: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204510.91631: getting variables 44842 1727204510.91634: in VariableManager get_vars() 44842 1727204510.91690: Calling all_inventory to load vars for managed-node1 44842 1727204510.91694: Calling groups_inventory to load vars for managed-node1 44842 1727204510.91697: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204510.91722: Calling all_plugins_play to load vars for managed-node1 44842 1727204510.91726: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204510.91731: Calling groups_plugins_play to load vars for managed-node1 44842 1727204510.92737: done sending task result for task 0affcd87-79f5-aad0-d242-000000000064 44842 1727204510.92741: WORKER PROCESS EXITING 44842 1727204510.93296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204510.94289: done with get_vars() 44842 1727204510.94307: done getting variables 44842 1727204510.94353: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with 'custom' table lookup matches the specified rule] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:183 Tuesday 24 September 2024 15:01:50 -0400 (0:00:00.112) 0:00:21.111 ***** 44842 1727204510.94381: entering _queue_task() for managed-node1/assert 44842 1727204510.94622: worker is 1 (out of 1 available) 44842 1727204510.94638: exiting _queue_task() for managed-node1/assert 44842 1727204510.94650: done queuing things up, now waiting for results queue to drain 44842 1727204510.94651: waiting for pending results... 
44842 1727204510.94850: running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 44842 1727204510.94933: in run() - task 0affcd87-79f5-aad0-d242-000000000065 44842 1727204510.94968: variable 'ansible_search_path' from source: unknown 44842 1727204510.95015: calling self._execute() 44842 1727204510.95172: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.95179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.95188: variable 'omit' from source: magic vars 44842 1727204510.95677: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.95703: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204510.95879: variable 'ansible_distribution_major_version' from source: facts 44842 1727204510.95889: Evaluated conditional (ansible_distribution_major_version != "7"): True 44842 1727204510.95900: variable 'omit' from source: magic vars 44842 1727204510.95952: variable 'omit' from source: magic vars 44842 1727204510.96021: variable 'omit' from source: magic vars 44842 1727204510.96145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204510.96223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204510.96272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204510.96295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.96310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204510.96377: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204510.96387: variable 
'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.96431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.96632: Set connection var ansible_shell_type to sh 44842 1727204510.96650: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204510.96661: Set connection var ansible_connection to ssh 44842 1727204510.96695: Set connection var ansible_pipelining to False 44842 1727204510.96711: Set connection var ansible_timeout to 10 44842 1727204510.96741: Set connection var ansible_shell_executable to /bin/sh 44842 1727204510.96781: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.96784: variable 'ansible_connection' from source: unknown 44842 1727204510.96788: variable 'ansible_module_compression' from source: unknown 44842 1727204510.96790: variable 'ansible_shell_type' from source: unknown 44842 1727204510.96795: variable 'ansible_shell_executable' from source: unknown 44842 1727204510.96798: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204510.96802: variable 'ansible_pipelining' from source: unknown 44842 1727204510.96805: variable 'ansible_timeout' from source: unknown 44842 1727204510.96808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204510.96919: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204510.96929: variable 'omit' from source: magic vars 44842 1727204510.96934: starting attempt loop 44842 1727204510.96937: running the handler 44842 1727204510.97059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204510.97248: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204510.97305: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204510.97415: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204510.97466: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204510.97563: variable 'route_rule_table_custom' from source: set_fact 44842 1727204510.97589: Evaluated conditional (route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")): True 44842 1727204510.97594: handler run complete 44842 1727204510.97609: attempt loop complete, returning result 44842 1727204510.97630: _execute() done 44842 1727204510.97634: dumping result to json 44842 1727204510.97636: done dumping result, returning 44842 1727204510.97640: done running TaskExecutor() for managed-node1/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule [0affcd87-79f5-aad0-d242-000000000065] 44842 1727204510.97642: sending task result for task 0affcd87-79f5-aad0-d242-000000000065 44842 1727204510.97749: done sending task result for task 0affcd87-79f5-aad0-d242-000000000065 44842 1727204510.97751: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204510.97801: no more pending results, returning what we have 44842 1727204510.97806: results queue empty 44842 1727204510.97807: checking for any_errors_fatal 44842 1727204510.97814: done checking for any_errors_fatal 44842 1727204510.97815: checking for max_fail_percentage 44842 1727204510.97816: done checking for max_fail_percentage 44842 1727204510.97817: checking to see if all hosts have failed and the running result is not ok 44842 1727204510.97818: done checking to see if all hosts have failed 44842 1727204510.97819: getting the remaining hosts for 
this loop 44842 1727204510.97821: done getting the remaining hosts for this loop 44842 1727204510.97824: getting the next task for host managed-node1 44842 1727204510.97830: done getting next task for host managed-node1 44842 1727204510.97834: ^ task is: TASK: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}" 44842 1727204510.97835: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204510.97838: getting variables 44842 1727204510.97840: in VariableManager get_vars() 44842 1727204510.97883: Calling all_inventory to load vars for managed-node1 44842 1727204510.97886: Calling groups_inventory to load vars for managed-node1 44842 1727204510.97888: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204510.97898: Calling all_plugins_play to load vars for managed-node1 44842 1727204510.97900: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204510.97902: Calling groups_plugins_play to load vars for managed-node1 44842 1727204510.99130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.01068: done with get_vars() 44842 1727204511.01103: done getting variables 44842 1727204511.01194: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204511.01315: variable 'interface' from source: set_fact TASK [Assert that the specified IPv4 routing rule was configured in the connection "ethtest0"] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:190 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.069) 0:00:21.181 ***** 44842 1727204511.01359: entering _queue_task() for managed-node1/assert 44842 1727204511.01691: worker is 1 (out of 1 available) 44842 1727204511.01706: exiting _queue_task() for managed-node1/assert 44842 1727204511.01735: done queuing things up, now waiting for results queue to drain 44842 1727204511.01737: waiting for pending results... 44842 1727204511.01971: running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" 44842 1727204511.02035: in run() - task 0affcd87-79f5-aad0-d242-000000000066 44842 1727204511.02050: variable 'ansible_search_path' from source: unknown 44842 1727204511.02085: calling self._execute() 44842 1727204511.02159: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.02167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.02177: variable 'omit' from source: magic vars 44842 1727204511.02472: variable 'ansible_distribution_major_version' from source: facts 44842 1727204511.02482: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204511.02488: variable 'omit' from source: magic vars 44842 1727204511.02505: variable 'omit' from source: magic vars 44842 1727204511.02584: variable 'interface' from source: set_fact 44842 1727204511.02598: variable 'omit' from source: magic vars 44842 1727204511.02633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204511.02660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204511.02683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204511.02702: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.02709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.02743: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204511.02751: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.02754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.02825: Set connection var ansible_shell_type to sh 44842 1727204511.02834: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204511.02839: Set connection var ansible_connection to ssh 44842 1727204511.02844: Set connection var ansible_pipelining to False 44842 1727204511.02849: Set connection var ansible_timeout to 10 44842 1727204511.02857: Set connection var ansible_shell_executable to /bin/sh 44842 1727204511.02878: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.02881: variable 'ansible_connection' from source: unknown 44842 1727204511.02884: variable 'ansible_module_compression' from source: unknown 44842 1727204511.02887: variable 'ansible_shell_type' from source: unknown 44842 1727204511.02890: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.02892: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.02894: variable 'ansible_pipelining' from source: unknown 44842 1727204511.02896: variable 'ansible_timeout' from source: unknown 44842 1727204511.02900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.03004: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204511.03014: variable 'omit' from source: magic vars 44842 1727204511.03019: starting attempt loop 44842 1727204511.03023: running the handler 44842 1727204511.03158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204511.03713: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204511.03787: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204511.03978: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204511.04027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204511.04213: variable 'connection_route_rule' from source: set_fact 44842 1727204511.04274: Evaluated conditional (connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")): True 44842 1727204511.04545: variable 'connection_route_rule' from source: set_fact 44842 1727204511.04616: Evaluated conditional (connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")): True 44842 1727204511.04817: variable 'connection_route_rule' from source: set_fact 44842 1727204511.04846: Evaluated conditional (connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")): True 44842 1727204511.05054: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05125: Evaluated conditional (connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")): True 44842 1727204511.05316: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05370: Evaluated 
conditional (connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")): True 44842 1727204511.05517: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05533: Evaluated conditional (connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")): True 44842 1727204511.05627: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05645: Evaluated conditional (connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")): True 44842 1727204511.05737: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05753: Evaluated conditional (connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")): True 44842 1727204511.05862: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05884: Evaluated conditional (connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")): True 44842 1727204511.05969: variable 'connection_route_rule' from source: set_fact 44842 1727204511.05989: Evaluated conditional (connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")): True 44842 1727204511.06016: handler run complete 44842 1727204511.06022: attempt loop complete, returning result 44842 1727204511.06025: _execute() done 44842 1727204511.06027: dumping result to json 44842 1727204511.06032: done dumping result, returning 44842 1727204511.06040: done running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" [0affcd87-79f5-aad0-d242-000000000066] 44842 1727204511.06045: sending task result for task 0affcd87-79f5-aad0-d242-000000000066 44842 1727204511.06145: done sending task result for task 0affcd87-79f5-aad0-d242-000000000066 44842 1727204511.06148: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 
1727204511.06197: no more pending results, returning what we have 44842 1727204511.06201: results queue empty 44842 1727204511.06202: checking for any_errors_fatal 44842 1727204511.06208: done checking for any_errors_fatal 44842 1727204511.06209: checking for max_fail_percentage 44842 1727204511.06211: done checking for max_fail_percentage 44842 1727204511.06212: checking to see if all hosts have failed and the running result is not ok 44842 1727204511.06213: done checking to see if all hosts have failed 44842 1727204511.06220: getting the remaining hosts for this loop 44842 1727204511.06222: done getting the remaining hosts for this loop 44842 1727204511.06225: getting the next task for host managed-node1 44842 1727204511.06232: done getting next task for host managed-node1 44842 1727204511.06235: ^ task is: TASK: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}" 44842 1727204511.06237: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204511.06240: getting variables 44842 1727204511.06242: in VariableManager get_vars() 44842 1727204511.06283: Calling all_inventory to load vars for managed-node1 44842 1727204511.06286: Calling groups_inventory to load vars for managed-node1 44842 1727204511.06288: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.06299: Calling all_plugins_play to load vars for managed-node1 44842 1727204511.06301: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.06303: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.07512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.08793: done with get_vars() 44842 1727204511.08819: done getting variables 44842 1727204511.08898: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204511.09077: variable 'interface' from source: set_fact TASK [Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:205 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.077) 0:00:21.259 ***** 44842 1727204511.09107: entering _queue_task() for managed-node1/assert 44842 1727204511.09485: worker is 1 (out of 1 available) 44842 1727204511.09501: exiting _queue_task() for managed-node1/assert 44842 1727204511.09742: done queuing things up, now waiting for results queue to drain 44842 1727204511.09744: waiting for pending results... 
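Each `ok: [managed-node1] => {"changed": false} MSG: All assertions passed` result above comes from an `assert` task evaluating a list of `search` conditionals against the recorded rule dump. A sketch of how the per-connection checks could be reproduced outside Ansible; the stdout sample and the `rules_missing` helper are hypothetical, shaped like the `priority ... from ... table ...` lines asserted in this run:

```python
import re

# Hypothetical per-connection routing-rule dump, shaped like the
# `connection_route_rule` fact asserted in this log.
connection_route_rule_stdout = "\n".join([
    "priority 30200 from 198.51.100.58/26 table 30200",
    "priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200",
    "priority 30202 from 0.0.0.0/0 ipproto 6 table 30200",
])

def rules_missing(stdout, patterns):
    """Return the patterns that do NOT match, mirroring how a failed
    assert would report the first conditional that evaluated to False."""
    return [p for p in patterns if not re.search(p, stdout)]

missing = rules_missing(connection_route_rule_stdout, [
    r"priority 30200 from 198\.51\.100\.58/26 table 30200",
    r"priority 30201 from 0\.0\.0\.0/0 fwmark 0x1/0x1 table 30200",
    r"priority 30202 from 0\.0\.0\.0/0 ipproto 6 table 30200",
])
```

An empty `missing` list corresponds to the all-True conditional evaluations logged above; any leftover pattern would surface as a failed assertion instead of `All assertions passed`.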
44842 1727204511.09850: running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" 44842 1727204511.09937: in run() - task 0affcd87-79f5-aad0-d242-000000000067 44842 1727204511.09950: variable 'ansible_search_path' from source: unknown 44842 1727204511.10005: calling self._execute() 44842 1727204511.10113: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.10117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.10166: variable 'omit' from source: magic vars 44842 1727204511.10715: variable 'ansible_distribution_major_version' from source: facts 44842 1727204511.10728: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204511.10735: variable 'omit' from source: magic vars 44842 1727204511.10755: variable 'omit' from source: magic vars 44842 1727204511.10862: variable 'interface' from source: set_fact 44842 1727204511.10894: variable 'omit' from source: magic vars 44842 1727204511.10936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204511.10976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204511.11000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204511.11020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.11027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.11060: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204511.11069: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.11076: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204511.11223: Set connection var ansible_shell_type to sh 44842 1727204511.11250: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204511.11278: Set connection var ansible_connection to ssh 44842 1727204511.11295: Set connection var ansible_pipelining to False 44842 1727204511.11298: Set connection var ansible_timeout to 10 44842 1727204511.11309: Set connection var ansible_shell_executable to /bin/sh 44842 1727204511.11351: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.11353: variable 'ansible_connection' from source: unknown 44842 1727204511.11358: variable 'ansible_module_compression' from source: unknown 44842 1727204511.11360: variable 'ansible_shell_type' from source: unknown 44842 1727204511.11383: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.11386: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.11388: variable 'ansible_pipelining' from source: unknown 44842 1727204511.11390: variable 'ansible_timeout' from source: unknown 44842 1727204511.11427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.11603: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204511.11627: variable 'omit' from source: magic vars 44842 1727204511.11632: starting attempt loop 44842 1727204511.11635: running the handler 44842 1727204511.11895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204511.12070: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204511.12108: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204511.12169: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204511.12199: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204511.12285: variable 'connection_route_rule6' from source: set_fact 44842 1727204511.12307: Evaluated conditional (connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")): True 44842 1727204511.12462: variable 'connection_route_rule6' from source: set_fact 44842 1727204511.12487: Evaluated conditional (connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")): True 44842 1727204511.12584: variable 'connection_route_rule6' from source: set_fact 44842 1727204511.12600: Evaluated conditional (connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")): True 44842 1727204511.12611: handler run complete 44842 1727204511.12635: attempt loop complete, returning result 44842 1727204511.12641: _execute() done 44842 1727204511.12644: dumping result to json 44842 1727204511.12646: done dumping result, returning 44842 1727204511.12657: done running TaskExecutor() for managed-node1/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" [0affcd87-79f5-aad0-d242-000000000067] 44842 1727204511.12697: sending task result for task 0affcd87-79f5-aad0-d242-000000000067 44842 1727204511.12774: done sending task result for task 0affcd87-79f5-aad0-d242-000000000067 44842 1727204511.12777: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204511.12828: no more pending results, returning what we have 44842 1727204511.12831: results queue empty 44842 1727204511.12832: 
checking for any_errors_fatal 44842 1727204511.12843: done checking for any_errors_fatal 44842 1727204511.12843: checking for max_fail_percentage 44842 1727204511.12846: done checking for max_fail_percentage 44842 1727204511.12847: checking to see if all hosts have failed and the running result is not ok 44842 1727204511.12847: done checking to see if all hosts have failed 44842 1727204511.12848: getting the remaining hosts for this loop 44842 1727204511.12850: done getting the remaining hosts for this loop 44842 1727204511.12854: getting the next task for host managed-node1 44842 1727204511.12860: done getting next task for host managed-node1 44842 1727204511.12863: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 44842 1727204511.12866: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204511.12870: getting variables 44842 1727204511.12872: in VariableManager get_vars() 44842 1727204511.12912: Calling all_inventory to load vars for managed-node1 44842 1727204511.12915: Calling groups_inventory to load vars for managed-node1 44842 1727204511.12917: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.12927: Calling all_plugins_play to load vars for managed-node1 44842 1727204511.12930: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.12933: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.13835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.15154: done with get_vars() 44842 1727204511.15186: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.061) 0:00:21.320 ***** 44842 1727204511.15288: entering _queue_task() for managed-node1/file 44842 1727204511.15663: worker is 1 (out of 1 available) 44842 1727204511.15681: exiting _queue_task() for managed-node1/file 44842 1727204511.15693: done queuing things up, now waiting for results queue to drain 44842 1727204511.15695: waiting for pending results... 
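For reference, the three conditionals evaluated in the assert task above (task 0affcd87-79f5-aad0-d242-000000000067) correspond to a task of roughly this shape. This is a sketch reconstructed only from the `Evaluated conditional (...)` records in the log; the actual task at tests_routing_rules.yml may differ in layout and surrounding context:

```yaml
# Hypothetical reconstruction from the log's "Evaluated conditional" records;
# not a verbatim copy of tests_routing_rules.yml.
- name: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"
  assert:
    that:
      - connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")
      - connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600")
        or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")
      - connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")
```

The `or` in the second condition accommodates the two orderings in which `ip -6 rule` may print the `not` keyword relative to `priority`, which is why the log shows a single conditional combining both `search` patterns.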
44842 1727204511.15886: running TaskExecutor() for managed-node1/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 44842 1727204511.15950: in run() - task 0affcd87-79f5-aad0-d242-000000000068 44842 1727204511.15968: variable 'ansible_search_path' from source: unknown 44842 1727204511.15998: calling self._execute() 44842 1727204511.16082: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.16086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.16093: variable 'omit' from source: magic vars 44842 1727204511.16371: variable 'ansible_distribution_major_version' from source: facts 44842 1727204511.16381: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204511.16388: variable 'omit' from source: magic vars 44842 1727204511.16410: variable 'omit' from source: magic vars 44842 1727204511.16431: variable 'omit' from source: magic vars 44842 1727204511.16467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204511.16493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204511.16514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204511.16528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.16537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.16563: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204511.16569: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.16571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.16638: Set connection var 
ansible_shell_type to sh 44842 1727204511.16646: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204511.16651: Set connection var ansible_connection to ssh 44842 1727204511.16656: Set connection var ansible_pipelining to False 44842 1727204511.16665: Set connection var ansible_timeout to 10 44842 1727204511.16670: Set connection var ansible_shell_executable to /bin/sh 44842 1727204511.16687: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.16690: variable 'ansible_connection' from source: unknown 44842 1727204511.16692: variable 'ansible_module_compression' from source: unknown 44842 1727204511.16695: variable 'ansible_shell_type' from source: unknown 44842 1727204511.16697: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.16713: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.16717: variable 'ansible_pipelining' from source: unknown 44842 1727204511.16720: variable 'ansible_timeout' from source: unknown 44842 1727204511.16751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.17032: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204511.17040: variable 'omit' from source: magic vars 44842 1727204511.17045: starting attempt loop 44842 1727204511.17048: running the handler 44842 1727204511.17070: _low_level_execute_command(): starting 44842 1727204511.17076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204511.17697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.17719: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.17734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44842 1727204511.17754: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.17796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.17809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.17881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.19533: stdout chunk (state=3): >>>/root <<< 44842 1727204511.19638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.19704: stderr chunk (state=3): >>><<< 44842 1727204511.19708: stdout chunk (state=3): >>><<< 44842 1727204511.19730: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.19741: _low_level_execute_command(): starting 44842 1727204511.19748: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350 `" && echo ansible-tmp-1727204511.1972976-46541-206211368434350="` echo /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350 `" ) && sleep 0' 44842 1727204511.20223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.20230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.20259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.20278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204511.20297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.20341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.20348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.20439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.22294: stdout chunk (state=3): >>>ansible-tmp-1727204511.1972976-46541-206211368434350=/root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350 <<< 44842 1727204511.22399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.22453: stderr chunk (state=3): >>><<< 44842 1727204511.22459: stdout chunk (state=3): >>><<< 44842 1727204511.22481: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204511.1972976-46541-206211368434350=/root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.22520: variable 'ansible_module_compression' from source: unknown 44842 1727204511.22571: ANSIBALLZ: Using lock for file 44842 1727204511.22575: ANSIBALLZ: Acquiring lock 44842 1727204511.22578: ANSIBALLZ: Lock acquired: 140164881038272 44842 1727204511.22580: ANSIBALLZ: Creating module 44842 1727204511.32311: ANSIBALLZ: Writing module into payload 44842 1727204511.32522: ANSIBALLZ: Writing module 44842 1727204511.32551: ANSIBALLZ: Renaming module 44842 1727204511.32567: ANSIBALLZ: Done creating module 44842 1727204511.32591: variable 'ansible_facts' from source: unknown 44842 1727204511.32683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/AnsiballZ_file.py 44842 1727204511.33488: Sending initial data 44842 1727204511.33497: Sent initial data (153 bytes) 44842 1727204511.33952: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.33956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.33989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204511.33993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.34006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.34076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.34089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.34178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.35893: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204511.35952: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204511.36005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpyegozukb /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/AnsiballZ_file.py <<< 44842 1727204511.36059: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204511.37241: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 44842 1727204511.37428: stderr chunk (state=3): >>><<< 44842 1727204511.37432: stdout chunk (state=3): >>><<< 44842 1727204511.37434: done transferring module to remote 44842 1727204511.37436: _low_level_execute_command(): starting 44842 1727204511.37439: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/ /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/AnsiballZ_file.py && sleep 0' 44842 1727204511.38026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204511.38044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.38060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.38081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.38126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.38143: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204511.38158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.38178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204511.38190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204511.38201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204511.38214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.38228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.38245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.38259: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.38273: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204511.38287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.38368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.38386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.38400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.38501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.40213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.40311: stderr chunk (state=3): >>><<< 44842 1727204511.40315: stdout chunk (state=3): >>><<< 44842 1727204511.40413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.40418: _low_level_execute_command(): starting 44842 1727204511.40421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/AnsiballZ_file.py && sleep 0' 44842 1727204511.40994: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204511.41010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.41026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.41045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.41091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.41104: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204511.41120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.41138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204511.41150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204511.41161: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204511.41178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.41192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.41208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.41221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 
1727204511.41233: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204511.41249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.41327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.41354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.41357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.41460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.55163: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 44842 1727204511.56202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204511.56206: stdout chunk (state=3): >>><<< 44842 1727204511.56208: stderr chunk (state=3): >>><<< 44842 1727204511.56352: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204511.56356: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204511.56366: _low_level_execute_command(): starting 44842 1727204511.56369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204511.1972976-46541-206211368434350/ > /dev/null 2>&1 && sleep 0' 44842 1727204511.56940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.56943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.56970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.56989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204511.56992: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.57055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.57068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.57151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.58914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.59004: stderr chunk (state=3): >>><<< 44842 1727204511.59008: stdout chunk (state=3): >>><<< 44842 1727204511.59472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.59477: handler run complete 44842 1727204511.59479: attempt loop complete, returning result 44842 1727204511.59481: _execute() done 44842 1727204511.59483: dumping result to json 44842 1727204511.59485: done dumping result, returning 44842 1727204511.59487: done running TaskExecutor() for managed-node1/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [0affcd87-79f5-aad0-d242-000000000068] 44842 1727204511.59489: sending task result for task 0affcd87-79f5-aad0-d242-000000000068 44842 1727204511.59572: done sending task result for task 0affcd87-79f5-aad0-d242-000000000068 44842 1727204511.59576: WORKER PROCESS EXITING changed: [managed-node1] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 44842 1727204511.59645: no more pending results, returning what we have 44842 1727204511.59649: results queue empty 44842 1727204511.59650: checking for any_errors_fatal 44842 1727204511.59657: done checking for any_errors_fatal 44842 1727204511.59658: checking for max_fail_percentage 44842 1727204511.59663: done checking for max_fail_percentage 44842 1727204511.59666: checking to see if all hosts have failed and the running result is not ok 44842 1727204511.59667: done checking to see if all hosts have failed 44842 1727204511.59667: getting the remaining hosts for this loop 44842 1727204511.59669: done getting the remaining hosts for this loop 44842 1727204511.59672: getting the next task for host managed-node1 44842 1727204511.59681: done getting next task for host managed-node1 44842 1727204511.59684: ^ task is: TASK: meta (flush_handlers) 44842 1727204511.59686: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204511.59690: getting variables 44842 1727204511.59692: in VariableManager get_vars() 44842 1727204511.59727: Calling all_inventory to load vars for managed-node1 44842 1727204511.59731: Calling groups_inventory to load vars for managed-node1 44842 1727204511.59733: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.59743: Calling all_plugins_play to load vars for managed-node1 44842 1727204511.59745: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.59749: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.61667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.63517: done with get_vars() 44842 1727204511.63548: done getting variables 44842 1727204511.63636: in VariableManager get_vars() 44842 1727204511.63652: Calling all_inventory to load vars for managed-node1 44842 1727204511.63655: Calling groups_inventory to load vars for managed-node1 44842 1727204511.63657: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.63667: Calling all_plugins_play to load vars for managed-node1 44842 1727204511.63670: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.63673: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.64950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.65982: done with get_vars() 44842 1727204511.66007: done queuing things up, now waiting for results queue to drain 44842 1727204511.66009: results queue empty 44842 1727204511.66009: checking for any_errors_fatal 44842 1727204511.66012: done checking for any_errors_fatal 44842 1727204511.66012: checking for max_fail_percentage 44842 1727204511.66013: done checking for 
max_fail_percentage 44842 1727204511.66013: checking to see if all hosts have failed and the running result is not ok 44842 1727204511.66014: done checking to see if all hosts have failed 44842 1727204511.66014: getting the remaining hosts for this loop 44842 1727204511.66015: done getting the remaining hosts for this loop 44842 1727204511.66017: getting the next task for host managed-node1 44842 1727204511.66020: done getting next task for host managed-node1 44842 1727204511.66021: ^ task is: TASK: meta (flush_handlers) 44842 1727204511.66022: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204511.66025: getting variables 44842 1727204511.66025: in VariableManager get_vars() 44842 1727204511.66033: Calling all_inventory to load vars for managed-node1 44842 1727204511.66035: Calling groups_inventory to load vars for managed-node1 44842 1727204511.66041: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.66046: Calling all_plugins_play to load vars for managed-node1 44842 1727204511.66047: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.66049: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.66738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.68119: done with get_vars() 44842 1727204511.68144: done getting variables 44842 1727204511.68209: in VariableManager get_vars() 44842 1727204511.68223: Calling all_inventory to load vars for managed-node1 44842 1727204511.68226: Calling groups_inventory to load vars for managed-node1 44842 1727204511.68228: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.68233: Calling 
all_plugins_play to load vars for managed-node1 44842 1727204511.68236: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.68239: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.69174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.70105: done with get_vars() 44842 1727204511.70127: done queuing things up, now waiting for results queue to drain 44842 1727204511.70129: results queue empty 44842 1727204511.70129: checking for any_errors_fatal 44842 1727204511.70130: done checking for any_errors_fatal 44842 1727204511.70130: checking for max_fail_percentage 44842 1727204511.70131: done checking for max_fail_percentage 44842 1727204511.70132: checking to see if all hosts have failed and the running result is not ok 44842 1727204511.70132: done checking to see if all hosts have failed 44842 1727204511.70133: getting the remaining hosts for this loop 44842 1727204511.70133: done getting the remaining hosts for this loop 44842 1727204511.70135: getting the next task for host managed-node1 44842 1727204511.70138: done getting next task for host managed-node1 44842 1727204511.70138: ^ task is: None 44842 1727204511.70139: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204511.70140: done queuing things up, now waiting for results queue to drain 44842 1727204511.70141: results queue empty 44842 1727204511.70141: checking for any_errors_fatal 44842 1727204511.70141: done checking for any_errors_fatal 44842 1727204511.70142: checking for max_fail_percentage 44842 1727204511.70142: done checking for max_fail_percentage 44842 1727204511.70143: checking to see if all hosts have failed and the running result is not ok 44842 1727204511.70143: done checking to see if all hosts have failed 44842 1727204511.70145: getting the next task for host managed-node1 44842 1727204511.70146: done getting next task for host managed-node1 44842 1727204511.70147: ^ task is: None 44842 1727204511.70148: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204511.70196: in VariableManager get_vars() 44842 1727204511.70212: done with get_vars() 44842 1727204511.70215: in VariableManager get_vars() 44842 1727204511.70224: done with get_vars() 44842 1727204511.70227: variable 'omit' from source: magic vars 44842 1727204511.70317: variable 'profile' from source: play vars 44842 1727204511.70403: in VariableManager get_vars() 44842 1727204511.70414: done with get_vars() 44842 1727204511.70429: variable 'omit' from source: magic vars 44842 1727204511.70480: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 44842 1727204511.70909: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44842 1727204511.70929: getting the remaining hosts for this loop 44842 1727204511.70930: done getting the remaining hosts for this loop 44842 1727204511.70932: getting the next task for host managed-node1 44842 1727204511.70934: done getting next task for host managed-node1 44842 1727204511.70935: ^ task is: TASK: Gathering Facts 44842 1727204511.70936: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204511.70938: getting variables 44842 1727204511.70938: in VariableManager get_vars() 44842 1727204511.70997: Calling all_inventory to load vars for managed-node1 44842 1727204511.70999: Calling groups_inventory to load vars for managed-node1 44842 1727204511.71000: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204511.71005: Calling all_plugins_play to load vars for managed-node1 44842 1727204511.71006: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204511.71008: Calling groups_plugins_play to load vars for managed-node1 44842 1727204511.72052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204511.73858: done with get_vars() 44842 1727204511.73886: done getting variables 44842 1727204511.73947: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.586) 0:00:21.907 ***** 44842 1727204511.73976: entering _queue_task() for managed-node1/gather_facts 44842 1727204511.74334: worker is 1 (out of 1 available) 44842 1727204511.74350: exiting _queue_task() for managed-node1/gather_facts 44842 1727204511.74370: done queuing things up, now waiting for results queue to drain 44842 1727204511.74372: waiting for pending results... 
44842 1727204511.74671: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44842 1727204511.74810: in run() - task 0affcd87-79f5-aad0-d242-0000000004b1 44842 1727204511.74833: variable 'ansible_search_path' from source: unknown 44842 1727204511.74878: calling self._execute() 44842 1727204511.74995: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.75013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.75034: variable 'omit' from source: magic vars 44842 1727204511.75467: variable 'ansible_distribution_major_version' from source: facts 44842 1727204511.75488: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204511.75500: variable 'omit' from source: magic vars 44842 1727204511.75530: variable 'omit' from source: magic vars 44842 1727204511.75584: variable 'omit' from source: magic vars 44842 1727204511.75631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204511.75687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204511.75715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204511.75739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.75756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204511.75808: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204511.75817: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.75825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.75936: Set connection var ansible_shell_type to sh 44842 1727204511.75950: Set connection 
var ansible_module_compression to ZIP_DEFLATED 44842 1727204511.75959: Set connection var ansible_connection to ssh 44842 1727204511.75969: Set connection var ansible_pipelining to False 44842 1727204511.75979: Set connection var ansible_timeout to 10 44842 1727204511.75995: Set connection var ansible_shell_executable to /bin/sh 44842 1727204511.76028: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.76036: variable 'ansible_connection' from source: unknown 44842 1727204511.76043: variable 'ansible_module_compression' from source: unknown 44842 1727204511.76049: variable 'ansible_shell_type' from source: unknown 44842 1727204511.76055: variable 'ansible_shell_executable' from source: unknown 44842 1727204511.76060: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204511.76070: variable 'ansible_pipelining' from source: unknown 44842 1727204511.76076: variable 'ansible_timeout' from source: unknown 44842 1727204511.76083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204511.76292: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204511.76312: variable 'omit' from source: magic vars 44842 1727204511.76334: starting attempt loop 44842 1727204511.76343: running the handler 44842 1727204511.76367: variable 'ansible_facts' from source: unknown 44842 1727204511.76392: _low_level_execute_command(): starting 44842 1727204511.76404: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204511.77252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204511.77271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 
1727204511.77287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.77316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.77363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.77379: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204511.77393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.77419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204511.77438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204511.77449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204511.77462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.77480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.77497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.77512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.77527: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204511.77550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.77623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.77646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.77662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.77861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204511.79362: stdout chunk (state=3): >>>/root <<< 44842 1727204511.79562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.79568: stdout chunk (state=3): >>><<< 44842 1727204511.79570: stderr chunk (state=3): >>><<< 44842 1727204511.79696: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.79701: _low_level_execute_command(): starting 44842 1727204511.79704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109 `" && echo ansible-tmp-1727204511.795964-46561-275351595400109="` echo /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109 `" ) && sleep 0' 44842 1727204511.80947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.80951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.80985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.80989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204511.80992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204511.80994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.81057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.81060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.81136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.82980: stdout chunk (state=3): >>>ansible-tmp-1727204511.795964-46561-275351595400109=/root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109 <<< 44842 1727204511.83094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.83175: stderr chunk (state=3): >>><<< 44842 1727204511.83179: stdout chunk (state=3): >>><<< 44842 1727204511.83637: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204511.795964-46561-275351595400109=/root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.83642: variable 'ansible_module_compression' from source: unknown 44842 1727204511.83645: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44842 1727204511.83648: variable 'ansible_facts' from source: unknown 44842 1727204511.83650: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/AnsiballZ_setup.py 44842 1727204511.83715: Sending initial data 44842 1727204511.83718: Sent initial data (153 bytes) 44842 1727204511.84670: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204511.84687: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 44842 1727204511.84703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.84721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.84761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.84777: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204511.84790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.84807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204511.84819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204511.84829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204511.84840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.84852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.84868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.84880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.84889: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204511.84903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.84978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.84995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.85009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.85104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 44842 1727204511.86826: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204511.86886: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204511.86931: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpew1q5m5m /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/AnsiballZ_setup.py <<< 44842 1727204511.86995: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204511.89997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204511.90183: stderr chunk (state=3): >>><<< 44842 1727204511.90187: stdout chunk (state=3): >>><<< 44842 1727204511.90189: done transferring module to remote 44842 1727204511.90192: _low_level_execute_command(): starting 44842 1727204511.90194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/ /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/AnsiballZ_setup.py && sleep 0' 44842 1727204511.91435: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204511.91492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.91509: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.91533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.91584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.91682: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204511.91704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.91725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204511.91737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204511.91747: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204511.91757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.91777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.91795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.91812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.91827: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204511.91840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.92037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.92057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.92075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.92285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204511.94042: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 44842 1727204511.94047: stdout chunk (state=3): >>><<< 44842 1727204511.94049: stderr chunk (state=3): >>><<< 44842 1727204511.94151: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204511.94155: _low_level_execute_command(): starting 44842 1727204511.94157: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/AnsiballZ_setup.py && sleep 0' 44842 1727204511.94892: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204511.94905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.94919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.94935: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.94986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.95001: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204511.95015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.95031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204511.95042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204511.95051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204511.95066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204511.95082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204511.95103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204511.95116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204511.95126: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204511.95138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204511.95222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204511.95240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204511.95254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204511.95395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204512.48533: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmI<<< 44842 1727204512.48585: stdout chunk 
(state=3): >>>n2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2775, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 757, "free": 2775}, "nocache": {"free": 3251, "used": 281}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 775, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271720448, "block_size": 4096, "block_total": 65519355, "block_available": 
64519463, "block_used": 999892, "inode_total": 131071472, "inode_available": 130998228, "inode_used": 73244, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "ethtest0", "rpltstbr", "eth0", "peerethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:d5:21:e5:60:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:21ff:fee5:60c0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", 
"prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": 
"on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "da:d5:74:1e:37:62", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::901d:7b7f:d9f2:e307", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": 
"off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "ge<<< 44842 1727204512.48600: stdout chunk (state=3): >>>neric_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off 
[fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::40d5:21ff:fee5:60c0", "fe80::108f:92ff:fee7:c1ab", "2001:db8::2", "fe80::901d:7b7f:d9f2:e307"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::108f:92ff:fee7:c1ab", "fe80::40d5:21ff:fee5:60c0", "fe80::901d:7b7f:d9f2:e307"]}, "ansible_loadavg": {"1m": 0.4, "5m": 0.43, "15m": 0.28}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "52", "epoch": "1727204512", "epoch_int": "1727204512", "date": "2024-09-24", "time": "15:01:52", "iso8601_micro": 
"2024-09-24T19:01:52.481164Z", "iso8601": "2024-09-24T19:01:52Z", "iso8601_basic": "20240924T150152481164", "iso8601_basic_short": "20240924T150152", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204512.50298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204512.50301: stdout chunk (state=3): >>><<< 44842 1727204512.50304: stderr chunk (state=3): >>><<< 44842 1727204512.50371: _low_level_execute_command() done: rc=0, stdout=
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "da:d5:74:1e:37:62", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::901d:7b7f:d9f2:e307", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::40d5:21ff:fee5:60c0", "fe80::108f:92ff:fee7:c1ab", "2001:db8::2", "fe80::901d:7b7f:d9f2:e307"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::108f:92ff:fee7:c1ab", "fe80::40d5:21ff:fee5:60c0", "fe80::901d:7b7f:d9f2:e307"]}, "ansible_loadavg": {"1m": 0.4, "5m": 0.43, "15m": 0.28}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", 
"ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "52", "epoch": "1727204512", "epoch_int": "1727204512", "date": "2024-09-24", "time": "15:01:52", "iso8601_micro": "2024-09-24T19:01:52.481164Z", "iso8601": "2024-09-24T19:01:52Z", "iso8601_basic": "20240924T150152481164", "iso8601_basic_short": "20240924T150152", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204512.50989: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204512.51019: _low_level_execute_command(): starting 44842 1727204512.51030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204511.795964-46561-275351595400109/ > /dev/null 2>&1 && sleep 0' 44842 1727204512.51726: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204512.51740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204512.51758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204512.51782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.51825: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204512.51837: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204512.51850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204512.51874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204512.51887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204512.51898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204512.51909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204512.51922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204512.51936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.51947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204512.51957: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204512.51978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204512.52055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204512.52082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204512.52102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204512.52190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204512.54116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204512.54119: stdout chunk (state=3): >>><<< 44842 1727204512.54122: stderr chunk (state=3): >>><<< 44842 1727204512.54769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204512.54773: handler run complete 44842 1727204512.54776: variable 'ansible_facts' from source: unknown 44842 1727204512.54778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.54852: variable 'ansible_facts' from source: unknown 44842 1727204512.54959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.55130: attempt loop complete, returning result 44842 1727204512.55144: _execute() done 44842 1727204512.55151: dumping result to json 44842 1727204512.55202: done dumping result, returning 44842 1727204512.55216: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-aad0-d242-0000000004b1] 44842 1727204512.55227: sending task result for task 0affcd87-79f5-aad0-d242-0000000004b1 ok: 
[managed-node1] 44842 1727204512.56229: no more pending results, returning what we have 44842 1727204512.56233: results queue empty 44842 1727204512.56234: checking for any_errors_fatal 44842 1727204512.56235: done checking for any_errors_fatal 44842 1727204512.56236: checking for max_fail_percentage 44842 1727204512.56237: done checking for max_fail_percentage 44842 1727204512.56238: checking to see if all hosts have failed and the running result is not ok 44842 1727204512.56239: done checking to see if all hosts have failed 44842 1727204512.56240: getting the remaining hosts for this loop 44842 1727204512.56241: done getting the remaining hosts for this loop 44842 1727204512.56247: getting the next task for host managed-node1 44842 1727204512.56254: done getting next task for host managed-node1 44842 1727204512.56256: ^ task is: TASK: meta (flush_handlers) 44842 1727204512.56258: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204512.56266: getting variables 44842 1727204512.56269: in VariableManager get_vars() 44842 1727204512.56300: Calling all_inventory to load vars for managed-node1 44842 1727204512.56302: Calling groups_inventory to load vars for managed-node1 44842 1727204512.56304: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.56315: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.56317: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.56319: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.57217: done sending task result for task 0affcd87-79f5-aad0-d242-0000000004b1 44842 1727204512.57221: WORKER PROCESS EXITING 44842 1727204512.57231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.58202: done with get_vars() 44842 1727204512.58220: done getting variables 44842 1727204512.58277: in VariableManager get_vars() 44842 1727204512.58287: Calling all_inventory to load vars for managed-node1 44842 1727204512.58288: Calling groups_inventory to load vars for managed-node1 44842 1727204512.58290: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.58293: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.58294: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.58300: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.59399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.60603: done with get_vars() 44842 1727204512.60626: done queuing things up, now waiting for results queue to drain 44842 1727204512.60628: results queue empty 44842 1727204512.60628: checking for any_errors_fatal 44842 1727204512.60631: done checking for any_errors_fatal 44842 1727204512.60632: checking for max_fail_percentage 44842 
1727204512.60632: done checking for max_fail_percentage 44842 1727204512.60633: checking to see if all hosts have failed and the running result is not ok 44842 1727204512.60633: done checking to see if all hosts have failed 44842 1727204512.60634: getting the remaining hosts for this loop 44842 1727204512.60635: done getting the remaining hosts for this loop 44842 1727204512.60638: getting the next task for host managed-node1 44842 1727204512.60641: done getting next task for host managed-node1 44842 1727204512.60643: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44842 1727204512.60645: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204512.60652: getting variables 44842 1727204512.60653: in VariableManager get_vars() 44842 1727204512.60667: Calling all_inventory to load vars for managed-node1 44842 1727204512.60669: Calling groups_inventory to load vars for managed-node1 44842 1727204512.60670: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.60674: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.60676: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.60677: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.61384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.62375: done with get_vars() 44842 1727204512.62391: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.884) 0:00:22.792 
***** 44842 1727204512.62447: entering _queue_task() for managed-node1/include_tasks 44842 1727204512.62715: worker is 1 (out of 1 available) 44842 1727204512.62729: exiting _queue_task() for managed-node1/include_tasks 44842 1727204512.62744: done queuing things up, now waiting for results queue to drain 44842 1727204512.62745: waiting for pending results... 44842 1727204512.63183: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44842 1727204512.63189: in run() - task 0affcd87-79f5-aad0-d242-000000000071 44842 1727204512.63193: variable 'ansible_search_path' from source: unknown 44842 1727204512.63196: variable 'ansible_search_path' from source: unknown 44842 1727204512.63199: calling self._execute() 44842 1727204512.63247: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.63251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204512.63262: variable 'omit' from source: magic vars 44842 1727204512.63658: variable 'ansible_distribution_major_version' from source: facts 44842 1727204512.63682: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204512.63699: _execute() done 44842 1727204512.63707: dumping result to json 44842 1727204512.63714: done dumping result, returning 44842 1727204512.63723: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-aad0-d242-000000000071] 44842 1727204512.63733: sending task result for task 0affcd87-79f5-aad0-d242-000000000071 44842 1727204512.63871: no more pending results, returning what we have 44842 1727204512.63877: in VariableManager get_vars() 44842 1727204512.63923: Calling all_inventory to load vars for managed-node1 44842 1727204512.63926: Calling groups_inventory to load vars for managed-node1 44842 1727204512.63929: Calling all_plugins_inventory to load vars for 
managed-node1 44842 1727204512.63942: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.63945: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.63948: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.64984: done sending task result for task 0affcd87-79f5-aad0-d242-000000000071 44842 1727204512.64989: WORKER PROCESS EXITING 44842 1727204512.65410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.66381: done with get_vars() 44842 1727204512.66396: variable 'ansible_search_path' from source: unknown 44842 1727204512.66396: variable 'ansible_search_path' from source: unknown 44842 1727204512.66418: we have included files to process 44842 1727204512.66418: generating all_blocks data 44842 1727204512.66419: done generating all_blocks data 44842 1727204512.66420: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204512.66421: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204512.66422: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204512.66826: done processing included file 44842 1727204512.66828: iterating over new_blocks loaded from include file 44842 1727204512.66829: in VariableManager get_vars() 44842 1727204512.66843: done with get_vars() 44842 1727204512.66844: filtering new block on tags 44842 1727204512.66855: done filtering new block on tags 44842 1727204512.66856: in VariableManager get_vars() 44842 1727204512.66870: done with get_vars() 44842 1727204512.66871: filtering new block on tags 44842 1727204512.66884: done filtering new block on tags 44842 1727204512.66885: in VariableManager get_vars() 44842 1727204512.66898: done with get_vars() 44842 
1727204512.66900: filtering new block on tags 44842 1727204512.66910: done filtering new block on tags 44842 1727204512.66912: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 44842 1727204512.66916: extending task lists for all hosts with included blocks 44842 1727204512.67141: done extending task lists 44842 1727204512.67142: done processing included files 44842 1727204512.67143: results queue empty 44842 1727204512.67143: checking for any_errors_fatal 44842 1727204512.67144: done checking for any_errors_fatal 44842 1727204512.67145: checking for max_fail_percentage 44842 1727204512.67145: done checking for max_fail_percentage 44842 1727204512.67146: checking to see if all hosts have failed and the running result is not ok 44842 1727204512.67146: done checking to see if all hosts have failed 44842 1727204512.67147: getting the remaining hosts for this loop 44842 1727204512.67148: done getting the remaining hosts for this loop 44842 1727204512.67149: getting the next task for host managed-node1 44842 1727204512.67152: done getting next task for host managed-node1 44842 1727204512.67154: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44842 1727204512.67155: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204512.67165: getting variables 44842 1727204512.67165: in VariableManager get_vars() 44842 1727204512.67175: Calling all_inventory to load vars for managed-node1 44842 1727204512.67176: Calling groups_inventory to load vars for managed-node1 44842 1727204512.67177: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.67181: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.67182: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.67184: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.72267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.73212: done with get_vars() 44842 1727204512.73231: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.108) 0:00:22.901 ***** 44842 1727204512.73289: entering _queue_task() for managed-node1/setup 44842 1727204512.73523: worker is 1 (out of 1 available) 44842 1727204512.73535: exiting _queue_task() for managed-node1/setup 44842 1727204512.73548: done queuing things up, now waiting for results queue to drain 44842 1727204512.73550: waiting for pending results... 
44842 1727204512.73732: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44842 1727204512.73834: in run() - task 0affcd87-79f5-aad0-d242-0000000004f2 44842 1727204512.73845: variable 'ansible_search_path' from source: unknown 44842 1727204512.73849: variable 'ansible_search_path' from source: unknown 44842 1727204512.73886: calling self._execute() 44842 1727204512.73982: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.73994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204512.74007: variable 'omit' from source: magic vars 44842 1727204512.74374: variable 'ansible_distribution_major_version' from source: facts 44842 1727204512.74394: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204512.74603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204512.76387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204512.76440: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204512.76468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204512.76499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204512.76519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204512.76582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204512.76607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204512.76625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204512.76651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204512.76666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204512.76701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204512.76724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204512.76741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204512.76769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204512.76779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204512.76891: variable '__network_required_facts' from source: role 
'' defaults 44842 1727204512.76900: variable 'ansible_facts' from source: unknown 44842 1727204512.77386: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44842 1727204512.77390: when evaluation is False, skipping this task 44842 1727204512.77393: _execute() done 44842 1727204512.77395: dumping result to json 44842 1727204512.77397: done dumping result, returning 44842 1727204512.77404: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-aad0-d242-0000000004f2] 44842 1727204512.77409: sending task result for task 0affcd87-79f5-aad0-d242-0000000004f2 44842 1727204512.77510: done sending task result for task 0affcd87-79f5-aad0-d242-0000000004f2 44842 1727204512.77513: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204512.77554: no more pending results, returning what we have 44842 1727204512.77558: results queue empty 44842 1727204512.77559: checking for any_errors_fatal 44842 1727204512.77563: done checking for any_errors_fatal 44842 1727204512.77572: checking for max_fail_percentage 44842 1727204512.77574: done checking for max_fail_percentage 44842 1727204512.77575: checking to see if all hosts have failed and the running result is not ok 44842 1727204512.77576: done checking to see if all hosts have failed 44842 1727204512.77576: getting the remaining hosts for this loop 44842 1727204512.77578: done getting the remaining hosts for this loop 44842 1727204512.77582: getting the next task for host managed-node1 44842 1727204512.77592: done getting next task for host managed-node1 44842 1727204512.77597: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44842 1727204512.77599: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204512.77613: getting variables 44842 1727204512.77614: in VariableManager get_vars() 44842 1727204512.77653: Calling all_inventory to load vars for managed-node1 44842 1727204512.77656: Calling groups_inventory to load vars for managed-node1 44842 1727204512.77658: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.77671: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.77674: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.77677: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.78539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.79535: done with get_vars() 44842 1727204512.79552: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.063) 0:00:22.964 ***** 44842 1727204512.79628: entering _queue_task() for managed-node1/stat 44842 1727204512.79880: worker is 1 (out of 1 available) 44842 1727204512.79891: exiting _queue_task() for managed-node1/stat 44842 1727204512.79904: done queuing things up, now waiting for results queue to drain 44842 1727204512.79905: waiting for pending results... 
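The skip at the top of this trace comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. The `difference` filter (loaded above from `mathstuff.py`) is plain set subtraction, so the fact-gathering task only runs when a required fact is missing. A minimal Python sketch of that check; the fact names below are illustrative placeholders, not values taken from this run:

```python
# Python equivalent of the Jinja condition
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# using set subtraction, as the `difference` filter does.

def needs_fact_gathering(required, ansible_facts):
    """True when any required fact is absent, i.e. the setup task would run."""
    missing = set(required) - set(ansible_facts)
    return len(missing) > 0

# Illustrative fact names (assumptions, not from the log above).
required = ["distribution", "distribution_major_version", "os_family"]
facts = {"distribution": "RedHat",
         "distribution_major_version": "9",
         "os_family": "RedHat"}

# All required facts present -> condition is False -> task skipped,
# matching "Evaluated conditional (...): False" in the trace.
print(needs_fact_gathering(required, facts))   # False

# Remove one fact and the condition flips to True (setup would run).
facts.pop("os_family")
print(needs_fact_gathering(required, facts))   # True
```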
44842 1727204512.80092: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 44842 1727204512.80185: in run() - task 0affcd87-79f5-aad0-d242-0000000004f4 44842 1727204512.80195: variable 'ansible_search_path' from source: unknown 44842 1727204512.80198: variable 'ansible_search_path' from source: unknown 44842 1727204512.80228: calling self._execute() 44842 1727204512.80307: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.80311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204512.80319: variable 'omit' from source: magic vars 44842 1727204512.80605: variable 'ansible_distribution_major_version' from source: facts 44842 1727204512.80615: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204512.80739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204512.80943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204512.80981: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204512.81026: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204512.81054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204512.81125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204512.81144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204512.81162: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204512.81184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204512.81252: variable '__network_is_ostree' from source: set_fact 44842 1727204512.81258: Evaluated conditional (not __network_is_ostree is defined): False 44842 1727204512.81261: when evaluation is False, skipping this task 44842 1727204512.81269: _execute() done 44842 1727204512.81271: dumping result to json 44842 1727204512.81274: done dumping result, returning 44842 1727204512.81282: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-aad0-d242-0000000004f4] 44842 1727204512.81285: sending task result for task 0affcd87-79f5-aad0-d242-0000000004f4 44842 1727204512.81377: done sending task result for task 0affcd87-79f5-aad0-d242-0000000004f4 44842 1727204512.81379: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44842 1727204512.81430: no more pending results, returning what we have 44842 1727204512.81434: results queue empty 44842 1727204512.81435: checking for any_errors_fatal 44842 1727204512.81441: done checking for any_errors_fatal 44842 1727204512.81441: checking for max_fail_percentage 44842 1727204512.81443: done checking for max_fail_percentage 44842 1727204512.81444: checking to see if all hosts have failed and the running result is not ok 44842 1727204512.81445: done checking to see if all hosts have failed 44842 1727204512.81445: getting the remaining hosts for this loop 44842 1727204512.81447: done getting the remaining hosts for this loop 44842 
1727204512.81451: getting the next task for host managed-node1 44842 1727204512.81458: done getting next task for host managed-node1 44842 1727204512.81462: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44842 1727204512.81466: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204512.81480: getting variables 44842 1727204512.81481: in VariableManager get_vars() 44842 1727204512.81526: Calling all_inventory to load vars for managed-node1 44842 1727204512.81529: Calling groups_inventory to load vars for managed-node1 44842 1727204512.81531: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.81539: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.81541: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.81543: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.82511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.83486: done with get_vars() 44842 1727204512.83502: done getting variables 44842 1727204512.83544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.039) 0:00:23.003 ***** 44842 1727204512.83577: entering _queue_task() for managed-node1/set_fact 44842 1727204512.83804: worker is 1 (out of 1 available) 44842 1727204512.83816: exiting _queue_task() for managed-node1/set_fact 44842 1727204512.83829: done queuing things up, now waiting for results queue to drain 44842 1727204512.83831: waiting for pending results... 44842 1727204512.84015: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44842 1727204512.84119: in run() - task 0affcd87-79f5-aad0-d242-0000000004f5 44842 1727204512.84129: variable 'ansible_search_path' from source: unknown 44842 1727204512.84132: variable 'ansible_search_path' from source: unknown 44842 1727204512.84164: calling self._execute() 44842 1727204512.84236: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.84240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204512.84248: variable 'omit' from source: magic vars 44842 1727204512.84518: variable 'ansible_distribution_major_version' from source: facts 44842 1727204512.84529: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204512.84660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204512.84856: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204512.84901: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204512.84928: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 
1727204512.84994: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204512.85056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204512.85079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204512.85100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204512.85117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204512.85182: variable '__network_is_ostree' from source: set_fact 44842 1727204512.85189: Evaluated conditional (not __network_is_ostree is defined): False 44842 1727204512.85193: when evaluation is False, skipping this task 44842 1727204512.85195: _execute() done 44842 1727204512.85197: dumping result to json 44842 1727204512.85201: done dumping result, returning 44842 1727204512.85206: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-aad0-d242-0000000004f5] 44842 1727204512.85216: sending task result for task 0affcd87-79f5-aad0-d242-0000000004f5 44842 1727204512.85296: done sending task result for task 0affcd87-79f5-aad0-d242-0000000004f5 44842 1727204512.85299: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44842 1727204512.85347: no more pending results, returning what we 
have 44842 1727204512.85351: results queue empty 44842 1727204512.85352: checking for any_errors_fatal 44842 1727204512.85359: done checking for any_errors_fatal 44842 1727204512.85362: checking for max_fail_percentage 44842 1727204512.85365: done checking for max_fail_percentage 44842 1727204512.85366: checking to see if all hosts have failed and the running result is not ok 44842 1727204512.85367: done checking to see if all hosts have failed 44842 1727204512.85368: getting the remaining hosts for this loop 44842 1727204512.85369: done getting the remaining hosts for this loop 44842 1727204512.85373: getting the next task for host managed-node1 44842 1727204512.85383: done getting next task for host managed-node1 44842 1727204512.85387: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44842 1727204512.85390: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204512.85403: getting variables 44842 1727204512.85405: in VariableManager get_vars() 44842 1727204512.85446: Calling all_inventory to load vars for managed-node1 44842 1727204512.85449: Calling groups_inventory to load vars for managed-node1 44842 1727204512.85451: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204512.85462: Calling all_plugins_play to load vars for managed-node1 44842 1727204512.85466: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204512.85469: Calling groups_plugins_play to load vars for managed-node1 44842 1727204512.86303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204512.87373: done with get_vars() 44842 1727204512.87392: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.038) 0:00:23.042 ***** 44842 1727204512.87458: entering _queue_task() for managed-node1/service_facts 44842 1727204512.87687: worker is 1 (out of 1 available) 44842 1727204512.87699: exiting _queue_task() for managed-node1/service_facts 44842 1727204512.87712: done queuing things up, now waiting for results queue to drain 44842 1727204512.87713: waiting for pending results... 
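Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") guard on `not __network_is_ostree is defined`: once the fact has been set by an earlier run of these tasks, both the stat and the set_fact are skipped. This is a memoization pattern, sketched below in Python. The marker path `/run/ostree-booted` is the conventional ostree check and is an assumption here; the actual path is not shown in this trace.

```python
# Memoization sketch of the "not __network_is_ostree is defined" guard:
# the check runs once per host, then every later evaluation is skipped.
import os

def check_ostree(facts):
    """Return (value, status); only probes the filesystem on the first call."""
    if "__network_is_ostree" in facts:          # conditional is False -> skip
        return facts["__network_is_ostree"], "skipped"
    # Assumed marker path for an ostree-booted system (not from the log).
    facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")
    return facts["__network_is_ostree"], "ran"

host_facts = {}
_, status1 = check_ostree(host_facts)   # first call performs the stat
_, status2 = check_ostree(host_facts)   # second call is skipped
print(status1, status2)                 # ran skipped
```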
44842 1727204512.87904: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 44842 1727204512.87995: in run() - task 0affcd87-79f5-aad0-d242-0000000004f7 44842 1727204512.88005: variable 'ansible_search_path' from source: unknown 44842 1727204512.88009: variable 'ansible_search_path' from source: unknown 44842 1727204512.88040: calling self._execute() 44842 1727204512.88116: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.88120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204512.88129: variable 'omit' from source: magic vars 44842 1727204512.88410: variable 'ansible_distribution_major_version' from source: facts 44842 1727204512.88420: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204512.88426: variable 'omit' from source: magic vars 44842 1727204512.88471: variable 'omit' from source: magic vars 44842 1727204512.88498: variable 'omit' from source: magic vars 44842 1727204512.88531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204512.88559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204512.88580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204512.88603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204512.88613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204512.88637: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204512.88641: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.88643: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44842 1727204512.88716: Set connection var ansible_shell_type to sh 44842 1727204512.88725: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204512.88730: Set connection var ansible_connection to ssh 44842 1727204512.88735: Set connection var ansible_pipelining to False 44842 1727204512.88741: Set connection var ansible_timeout to 10 44842 1727204512.88747: Set connection var ansible_shell_executable to /bin/sh 44842 1727204512.88766: variable 'ansible_shell_executable' from source: unknown 44842 1727204512.88769: variable 'ansible_connection' from source: unknown 44842 1727204512.88772: variable 'ansible_module_compression' from source: unknown 44842 1727204512.88775: variable 'ansible_shell_type' from source: unknown 44842 1727204512.88777: variable 'ansible_shell_executable' from source: unknown 44842 1727204512.88781: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204512.88784: variable 'ansible_pipelining' from source: unknown 44842 1727204512.88787: variable 'ansible_timeout' from source: unknown 44842 1727204512.88792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204512.88936: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204512.88945: variable 'omit' from source: magic vars 44842 1727204512.88949: starting attempt loop 44842 1727204512.88953: running the handler 44842 1727204512.88966: _low_level_execute_command(): starting 44842 1727204512.88972: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204512.89510: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 44842 1727204512.89525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.89540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204512.89556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.89580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204512.89613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204512.89626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204512.89701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204512.91316: stdout chunk (state=3): >>>/root <<< 44842 1727204512.91420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204512.91477: stderr chunk (state=3): >>><<< 44842 1727204512.91480: stdout chunk (state=3): >>><<< 44842 1727204512.91501: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204512.91517: _low_level_execute_command(): starting 44842 1727204512.91524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464 `" && echo ansible-tmp-1727204512.915043-46617-183517320207464="` echo /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464 `" ) && sleep 0' 44842 1727204512.91996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204512.92008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.92029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204512.92048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204512.92077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204512.92103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204512.92114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204512.92186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204512.94030: stdout chunk (state=3): >>>ansible-tmp-1727204512.915043-46617-183517320207464=/root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464 <<< 44842 1727204512.94241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204512.94246: stdout chunk (state=3): >>><<< 44842 1727204512.94248: stderr chunk (state=3): >>><<< 44842 1727204512.94271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204512.915043-46617-183517320207464=/root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204512.94605: variable 'ansible_module_compression' from source: unknown 44842 1727204512.94608: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44842 1727204512.94611: variable 'ansible_facts' from source: unknown 44842 1727204512.94613: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/AnsiballZ_service_facts.py 44842 1727204512.95907: Sending initial data 44842 1727204512.95911: Sent initial data (161 bytes) 44842 1727204512.97319: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204512.97333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204512.97347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204512.97371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.97419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204512.97431: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204512.97444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204512.97465: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204512.97481: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204512.97498: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204512.97511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204512.97523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204512.97538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204512.97549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204512.97563: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204512.97580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204512.97671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204512.97688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204512.97705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204512.97800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204512.99487: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204512.99540: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: 
Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204512.99590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp5w9x2xhb /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/AnsiballZ_service_facts.py <<< 44842 1727204512.99643: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204513.00548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204513.00770: stderr chunk (state=3): >>><<< 44842 1727204513.00773: stdout chunk (state=3): >>><<< 44842 1727204513.00777: done transferring module to remote 44842 1727204513.00779: _low_level_execute_command(): starting 44842 1727204513.00782: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/ /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/AnsiballZ_service_facts.py && sleep 0' 44842 1727204513.01425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204513.01446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204513.01466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204513.01486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204513.01527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204513.01539: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204513.01563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204513.01585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204513.01596: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.9.148 is address <<< 44842 1727204513.01607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204513.01618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204513.01632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204513.01651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204513.01681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204513.01694: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204513.01707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204513.01797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204513.01818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204513.01833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204513.01922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204513.03616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204513.03698: stderr chunk (state=3): >>><<< 44842 1727204513.03707: stdout chunk (state=3): >>><<< 44842 1727204513.03806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204513.03809: _low_level_execute_command(): starting 44842 1727204513.03812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/AnsiballZ_service_facts.py && sleep 0' 44842 1727204513.04430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204513.04444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204513.04468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204513.04487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204513.04526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204513.04537: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204513.04549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204513.04577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204513.04588: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204513.04598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
44842 1727204513.04609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204513.04622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204513.04637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204513.04647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204513.04657: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204513.04681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204513.04755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204513.04786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204513.04802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204513.04901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204514.32919: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": 
{"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state<<< 44842 1727204514.32938: stdout chunk (state=3): >>>": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": 
"systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": 
{"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syste<<< 44842 1727204514.32944: stdout chunk (state=3): >>>md-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", 
"status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.se<<< 44842 1727204514.32951: stdout chunk (state=3): >>>rvice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibern<<< 44842 1727204514.32954: stdout chunk (state=3): 
>>>ate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44842 1727204514.34153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204514.34213: stderr chunk (state=3): >>><<< 44842 1727204514.34216: stdout chunk (state=3): >>><<< 44842 1727204514.34244: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204514.34644: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204514.34653: _low_level_execute_command(): starting 44842 1727204514.34656: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204512.915043-46617-183517320207464/ > /dev/null 2>&1 && sleep 0' 44842 1727204514.35142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.35154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.35180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204514.35192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.35247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204514.35258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204514.35317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204514.37071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204514.37124: stderr chunk (state=3): >>><<< 44842 1727204514.37128: stdout chunk (state=3): >>><<< 44842 1727204514.37147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204514.37152: handler run complete 44842 1727204514.37257: variable 'ansible_facts' from 
source: unknown 44842 1727204514.37359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204514.37608: variable 'ansible_facts' from source: unknown 44842 1727204514.37693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204514.37801: attempt loop complete, returning result 44842 1727204514.37805: _execute() done 44842 1727204514.37808: dumping result to json 44842 1727204514.37839: done dumping result, returning 44842 1727204514.37847: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-aad0-d242-0000000004f7] 44842 1727204514.37852: sending task result for task 0affcd87-79f5-aad0-d242-0000000004f7 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204514.38444: no more pending results, returning what we have 44842 1727204514.38447: results queue empty 44842 1727204514.38448: checking for any_errors_fatal 44842 1727204514.38451: done checking for any_errors_fatal 44842 1727204514.38451: checking for max_fail_percentage 44842 1727204514.38453: done checking for max_fail_percentage 44842 1727204514.38453: checking to see if all hosts have failed and the running result is not ok 44842 1727204514.38454: done checking to see if all hosts have failed 44842 1727204514.38454: getting the remaining hosts for this loop 44842 1727204514.38455: done getting the remaining hosts for this loop 44842 1727204514.38458: getting the next task for host managed-node1 44842 1727204514.38463: done getting next task for host managed-node1 44842 1727204514.38467: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44842 1727204514.38469: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204514.38482: done sending task result for task 0affcd87-79f5-aad0-d242-0000000004f7 44842 1727204514.38485: WORKER PROCESS EXITING 44842 1727204514.38489: getting variables 44842 1727204514.38491: in VariableManager get_vars() 44842 1727204514.38515: Calling all_inventory to load vars for managed-node1 44842 1727204514.38517: Calling groups_inventory to load vars for managed-node1 44842 1727204514.38518: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204514.38525: Calling all_plugins_play to load vars for managed-node1 44842 1727204514.38526: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204514.38528: Calling groups_plugins_play to load vars for managed-node1 44842 1727204514.39333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204514.40430: done with get_vars() 44842 1727204514.40450: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:54 -0400 (0:00:01.530) 0:00:24.573 ***** 44842 1727204514.40523: entering _queue_task() for managed-node1/package_facts 44842 1727204514.40771: worker is 1 (out of 1 available) 44842 1727204514.40785: exiting _queue_task() for managed-node1/package_facts 44842 1727204514.40797: done queuing things up, now waiting for 
results queue to drain 44842 1727204514.40799: waiting for pending results... 44842 1727204514.40989: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 44842 1727204514.41079: in run() - task 0affcd87-79f5-aad0-d242-0000000004f8 44842 1727204514.41093: variable 'ansible_search_path' from source: unknown 44842 1727204514.41097: variable 'ansible_search_path' from source: unknown 44842 1727204514.41128: calling self._execute() 44842 1727204514.41213: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204514.41219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204514.41231: variable 'omit' from source: magic vars 44842 1727204514.41526: variable 'ansible_distribution_major_version' from source: facts 44842 1727204514.41532: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204514.41540: variable 'omit' from source: magic vars 44842 1727204514.41589: variable 'omit' from source: magic vars 44842 1727204514.41613: variable 'omit' from source: magic vars 44842 1727204514.41651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204514.41683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204514.41701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204514.41726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204514.41737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204514.41779: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204514.41789: variable 'ansible_host' from source: host vars for 
'managed-node1' 44842 1727204514.41796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204514.41891: Set connection var ansible_shell_type to sh 44842 1727204514.41906: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204514.41915: Set connection var ansible_connection to ssh 44842 1727204514.41924: Set connection var ansible_pipelining to False 44842 1727204514.41933: Set connection var ansible_timeout to 10 44842 1727204514.41943: Set connection var ansible_shell_executable to /bin/sh 44842 1727204514.41973: variable 'ansible_shell_executable' from source: unknown 44842 1727204514.41982: variable 'ansible_connection' from source: unknown 44842 1727204514.41988: variable 'ansible_module_compression' from source: unknown 44842 1727204514.41995: variable 'ansible_shell_type' from source: unknown 44842 1727204514.42002: variable 'ansible_shell_executable' from source: unknown 44842 1727204514.42008: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204514.42015: variable 'ansible_pipelining' from source: unknown 44842 1727204514.42022: variable 'ansible_timeout' from source: unknown 44842 1727204514.42030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204514.42231: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204514.42246: variable 'omit' from source: magic vars 44842 1727204514.42257: starting attempt loop 44842 1727204514.42270: running the handler 44842 1727204514.42289: _low_level_execute_command(): starting 44842 1727204514.42301: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204514.43076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 
1727204514.43092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.43107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.43126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.43175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.43189: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204514.43204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.43225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204514.43238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204514.43251: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204514.43270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.43285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.43302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.43313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.43324: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204514.43336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.43448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204514.43506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204514.45040: stdout chunk (state=3): >>>/root <<< 44842 1727204514.45144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204514.45210: stderr chunk (state=3): >>><<< 44842 1727204514.45212: stdout chunk (state=3): >>><<< 44842 1727204514.45252: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204514.45256: _low_level_execute_command(): starting 44842 1727204514.45259: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689 `" && echo ansible-tmp-1727204514.4522707-46680-23799848059689="` echo /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689 `" ) && sleep 0' 44842 1727204514.45829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.45874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.45912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.45921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.45952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204514.45965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204514.46021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204514.47872: stdout chunk (state=3): >>>ansible-tmp-1727204514.4522707-46680-23799848059689=/root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689 <<< 44842 1727204514.48068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204514.48072: stdout chunk (state=3): >>><<< 44842 1727204514.48075: stderr chunk (state=3): >>><<< 44842 1727204514.48373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204514.4522707-46680-23799848059689=/root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204514.48377: variable 'ansible_module_compression' from source: unknown 44842 1727204514.48379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44842 1727204514.48382: variable 'ansible_facts' from source: unknown 44842 1727204514.48506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/AnsiballZ_package_facts.py 44842 1727204514.48683: Sending initial data 44842 1727204514.48685: Sent initial data (161 bytes) 44842 1727204514.49781: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204514.49799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.49815: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 44842 1727204514.49834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.49881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.49897: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204514.49913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.49931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204514.49942: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204514.49954: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204514.49972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.49986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.50007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.50021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.50033: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204514.50048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.50133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204514.50149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204514.50168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204514.50345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204514.52004: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204514.52052: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204514.52107: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpgdbeh8b2 /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/AnsiballZ_package_facts.py <<< 44842 1727204514.52155: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204514.55429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204514.55570: stderr chunk (state=3): >>><<< 44842 1727204514.55574: stdout chunk (state=3): >>><<< 44842 1727204514.55577: done transferring module to remote 44842 1727204514.55579: _low_level_execute_command(): starting 44842 1727204514.55581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/ /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/AnsiballZ_package_facts.py && sleep 0' 44842 1727204514.56957: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204514.57285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.57301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.57319: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.57371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.57385: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204514.57401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.57420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204514.57433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204514.57444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204514.57455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.57477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.57493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.57506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.57517: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204514.57529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.57609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204514.57626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204514.57639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204514.57722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204514.59578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204514.59582: stdout chunk (state=3): >>><<< 
44842 1727204514.59585: stderr chunk (state=3): >>><<< 44842 1727204514.59587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204514.59589: _low_level_execute_command(): starting 44842 1727204514.59591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/AnsiballZ_package_facts.py && sleep 0' 44842 1727204514.60833: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204514.61687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.61705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.61724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 
1727204514.61777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.61791: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204514.61806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.61824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204514.61836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204514.61847: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204514.61858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204514.61878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204514.61895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204514.61908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204514.61919: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204514.61932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204514.62014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204514.62044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204514.62056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204514.62154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204515.07849: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": 
"linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", 
"release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3": [{<<< 44842 1727204515.08062: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": 
"iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", 
"version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": 
"libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": 
[{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", 
"version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": 
[{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": 
[{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", 
"version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": 
"kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": 
[{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", 
"release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": 
"4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44842 1727204515.09658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
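The module result that closes above is the `ansible_facts.packages` dict produced by `package_facts` (invoked here with `manager: ["auto"]`, `strategy: "first"`). Its shape, as visible in the log, is a dict keyed by package name whose values are lists (a name like `gpg-pubkey` can appear more than once), each entry carrying `name`, `version`, `release`, `epoch`, `arch`, and `source`. A minimal sketch of consuming that structure, using a few entries copied from the output above (the `full_version` helper is illustrative, not part of the module):

```python
# Sample of the ansible_facts.packages structure, taken verbatim from the
# module output above. Keys are package names; values are lists because a
# single name can be installed multiple times (e.g. imported GPG keys).
packages = {
    "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "gpg-pubkey": [
        {"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb",
         "epoch": None, "arch": None, "source": "rpm"},
        {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19",
         "epoch": None, "arch": None, "source": "rpm"},
    ],
}

def full_version(pkg):
    # Render "epoch:version-release" the way rpm prints it; the epoch
    # prefix is omitted when the fact reports it as null.
    prefix = f"{pkg['epoch']}:" if pkg["epoch"] is not None else ""
    return f"{prefix}{pkg['version']}-{pkg['release']}"

print(full_version(packages["nss"][0]))  # 3.101.0-7.el9
print(len(packages["gpg-pubkey"]))       # 2
```

In a playbook, the same data is reachable after a `package_facts` task as `ansible_facts.packages['nss'][0].version`.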
<<< 44842 1727204515.09668: stdout chunk (state=3): >>><<< 44842 1727204515.09671: stderr chunk (state=3): >>><<< 44842 1727204515.09773: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.9.148 closed.
44842 1727204515.13341: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204515.13378: _low_level_execute_command(): starting 44842 1727204515.13389: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204514.4522707-46680-23799848059689/ > /dev/null 2>&1 && sleep 0' 44842 1727204515.14241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204515.14257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204515.14279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204515.14302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204515.14342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204515.14355: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204515.14375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204515.14392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204515.14406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 44842 1727204515.14421: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204515.14433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204515.14448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204515.14468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204515.14480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204515.14491: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204515.14505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204515.14584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204515.14618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204515.14639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204515.14730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204515.16653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204515.16657: stdout chunk (state=3): >>><<< 44842 1727204515.16675: stderr chunk (state=3): >>><<< 44842 1727204515.16973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204515.16976: handler run complete 44842 1727204515.17736: variable 'ansible_facts' from source: unknown 44842 1727204515.18093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.20076: variable 'ansible_facts' from source: unknown 44842 1727204515.20428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.20872: attempt loop complete, returning result 44842 1727204515.20883: _execute() done 44842 1727204515.20886: dumping result to json 44842 1727204515.21049: done dumping result, returning 44842 1727204515.21063: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-aad0-d242-0000000004f8] 44842 1727204515.21075: sending task result for task 0affcd87-79f5-aad0-d242-0000000004f8 44842 1727204515.23414: done sending task result for task 0affcd87-79f5-aad0-d242-0000000004f8 44842 1727204515.23418: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204515.23586: no more pending results, returning what we have 44842 1727204515.23590: results queue empty 44842 1727204515.23591: checking for 
any_errors_fatal 44842 1727204515.23597: done checking for any_errors_fatal 44842 1727204515.23597: checking for max_fail_percentage 44842 1727204515.23599: done checking for max_fail_percentage 44842 1727204515.23600: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.23601: done checking to see if all hosts have failed 44842 1727204515.23601: getting the remaining hosts for this loop 44842 1727204515.23603: done getting the remaining hosts for this loop 44842 1727204515.23606: getting the next task for host managed-node1 44842 1727204515.23613: done getting next task for host managed-node1 44842 1727204515.23617: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44842 1727204515.23619: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.23628: getting variables 44842 1727204515.23630: in VariableManager get_vars() 44842 1727204515.23661: Calling all_inventory to load vars for managed-node1 44842 1727204515.23666: Calling groups_inventory to load vars for managed-node1 44842 1727204515.23669: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.23678: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.23680: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.23683: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.24630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.25657: done with get_vars() 44842 1727204515.25679: done getting variables 44842 1727204515.25725: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.852) 0:00:25.425 ***** 44842 1727204515.25755: entering _queue_task() for managed-node1/debug 44842 1727204515.26075: worker is 1 (out of 1 available) 44842 1727204515.26087: exiting _queue_task() for managed-node1/debug 44842 1727204515.26099: done queuing things up, now waiting for results queue to drain 44842 1727204515.26100: waiting for pending results... 
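The censored result for "Check which packages are installed" above is the effect of `no_log: true`: the module still ran and returned `ansible_facts`, but Ansible replaces the result payload with the "censored" placeholder in all output. A minimal sketch of a task that would log this way; the role's actual task file is not shown here, so the module and parameters are assumptions:

```yaml
# Hedged sketch: a fact-gathering task marked no_log so its large
# result is censored in playbook output, as seen in the log above.
- name: Check which packages are installed
  package_facts:
    manager: auto
  no_log: true  # result replaced by the "censored" placeholder
```

Note that `no_log` censors the result everywhere, including registered variables echoed later, which is why the log prints only `"changed": false` alongside the censorship notice.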
44842 1727204515.26389: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 44842 1727204515.26507: in run() - task 0affcd87-79f5-aad0-d242-000000000072 44842 1727204515.26526: variable 'ansible_search_path' from source: unknown 44842 1727204515.26533: variable 'ansible_search_path' from source: unknown 44842 1727204515.26585: calling self._execute() 44842 1727204515.26700: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.26710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.26724: variable 'omit' from source: magic vars 44842 1727204515.27108: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.27127: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.27138: variable 'omit' from source: magic vars 44842 1727204515.27180: variable 'omit' from source: magic vars 44842 1727204515.27280: variable 'network_provider' from source: set_fact 44842 1727204515.27301: variable 'omit' from source: magic vars 44842 1727204515.27349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204515.27396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204515.27423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204515.27445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204515.27461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204515.27503: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204515.27512: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 
1727204515.27520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.27624: Set connection var ansible_shell_type to sh 44842 1727204515.27642: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204515.27652: Set connection var ansible_connection to ssh 44842 1727204515.27662: Set connection var ansible_pipelining to False 44842 1727204515.27676: Set connection var ansible_timeout to 10 44842 1727204515.27692: Set connection var ansible_shell_executable to /bin/sh 44842 1727204515.27718: variable 'ansible_shell_executable' from source: unknown 44842 1727204515.27726: variable 'ansible_connection' from source: unknown 44842 1727204515.27732: variable 'ansible_module_compression' from source: unknown 44842 1727204515.27739: variable 'ansible_shell_type' from source: unknown 44842 1727204515.27744: variable 'ansible_shell_executable' from source: unknown 44842 1727204515.27750: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.27756: variable 'ansible_pipelining' from source: unknown 44842 1727204515.27761: variable 'ansible_timeout' from source: unknown 44842 1727204515.27769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.27919: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204515.27939: variable 'omit' from source: magic vars 44842 1727204515.27948: starting attempt loop 44842 1727204515.27954: running the handler 44842 1727204515.28006: handler run complete 44842 1727204515.28027: attempt loop complete, returning result 44842 1727204515.28035: _execute() done 44842 1727204515.28043: dumping result to json 44842 1727204515.28050: done dumping result, returning 
44842 1727204515.28060: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-aad0-d242-000000000072] 44842 1727204515.28072: sending task result for task 0affcd87-79f5-aad0-d242-000000000072 ok: [managed-node1] => {} MSG: Using network provider: nm 44842 1727204515.28227: no more pending results, returning what we have 44842 1727204515.28231: results queue empty 44842 1727204515.28232: checking for any_errors_fatal 44842 1727204515.28241: done checking for any_errors_fatal 44842 1727204515.28242: checking for max_fail_percentage 44842 1727204515.28244: done checking for max_fail_percentage 44842 1727204515.28245: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.28246: done checking to see if all hosts have failed 44842 1727204515.28246: getting the remaining hosts for this loop 44842 1727204515.28248: done getting the remaining hosts for this loop 44842 1727204515.28252: getting the next task for host managed-node1 44842 1727204515.28260: done getting next task for host managed-node1 44842 1727204515.28267: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44842 1727204515.28269: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.28279: getting variables 44842 1727204515.28281: in VariableManager get_vars() 44842 1727204515.28322: Calling all_inventory to load vars for managed-node1 44842 1727204515.28325: Calling groups_inventory to load vars for managed-node1 44842 1727204515.28328: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.28339: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.28343: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.28346: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.29275: done sending task result for task 0affcd87-79f5-aad0-d242-000000000072 44842 1727204515.29279: WORKER PROCESS EXITING 44842 1727204515.29707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.31877: done with get_vars() 44842 1727204515.31905: done getting variables 44842 1727204515.31951: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.062) 0:00:25.487 ***** 44842 1727204515.31978: entering _queue_task() for managed-node1/fail 44842 1727204515.32222: worker is 1 (out of 1 available) 44842 1727204515.32235: exiting _queue_task() for managed-node1/fail 44842 1727204515.32247: done queuing things up, now waiting for results queue to drain 44842 1727204515.32248: waiting for pending results... 
44842 1727204515.32436: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44842 1727204515.32514: in run() - task 0affcd87-79f5-aad0-d242-000000000073 44842 1727204515.32524: variable 'ansible_search_path' from source: unknown 44842 1727204515.32528: variable 'ansible_search_path' from source: unknown 44842 1727204515.32558: calling self._execute() 44842 1727204515.32638: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.32641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.32651: variable 'omit' from source: magic vars 44842 1727204515.32943: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.32954: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.33043: variable 'network_state' from source: role '' defaults 44842 1727204515.33051: Evaluated conditional (network_state != {}): False 44842 1727204515.33054: when evaluation is False, skipping this task 44842 1727204515.33057: _execute() done 44842 1727204515.33059: dumping result to json 44842 1727204515.33066: done dumping result, returning 44842 1727204515.33073: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-aad0-d242-000000000073] 44842 1727204515.33079: sending task result for task 0affcd87-79f5-aad0-d242-000000000073 44842 1727204515.33171: done sending task result for task 0affcd87-79f5-aad0-d242-000000000073 44842 1727204515.33175: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204515.33217: no more pending results, 
returning what we have 44842 1727204515.33221: results queue empty 44842 1727204515.33222: checking for any_errors_fatal 44842 1727204515.33231: done checking for any_errors_fatal 44842 1727204515.33231: checking for max_fail_percentage 44842 1727204515.33233: done checking for max_fail_percentage 44842 1727204515.33234: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.33235: done checking to see if all hosts have failed 44842 1727204515.33236: getting the remaining hosts for this loop 44842 1727204515.33237: done getting the remaining hosts for this loop 44842 1727204515.33242: getting the next task for host managed-node1 44842 1727204515.33248: done getting next task for host managed-node1 44842 1727204515.33252: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44842 1727204515.33254: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.33270: getting variables 44842 1727204515.33272: in VariableManager get_vars() 44842 1727204515.33313: Calling all_inventory to load vars for managed-node1 44842 1727204515.33316: Calling groups_inventory to load vars for managed-node1 44842 1727204515.33318: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.33327: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.33329: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.33331: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.34295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.35236: done with get_vars() 44842 1727204515.35255: done getting variables 44842 1727204515.35301: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.033) 0:00:25.521 ***** 44842 1727204515.35323: entering _queue_task() for managed-node1/fail 44842 1727204515.35560: worker is 1 (out of 1 available) 44842 1727204515.35574: exiting _queue_task() for managed-node1/fail 44842 1727204515.35587: done queuing things up, now waiting for results queue to drain 44842 1727204515.35588: waiting for pending results... 
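Both "Abort applying the network state configuration..." guards above were skipped for the same reason: `network_state` came from the role defaults as an empty dict, so the condition `network_state != {}` evaluated to False and the `fail` action never executed, producing the `skipping:` result with `false_condition` recorded. A sketch of such a guard, with the `when` expression copied from the log and the failure message an assumption:

```yaml
# Hedged sketch of a conditional abort; only the when-expression
# "network_state != {}" is confirmed by the log.
- name: Abort if network_state is used with the initscripts provider
  fail:
    msg: "The network_state variable is not supported with the initscripts provider"
  when: network_state != {}
```

With a list of `when` conditions, Ansible evaluates them in order and reports the first False one as `false_condition`, which is why only `network_state != {}` appears in the skip result even though the task name mentions the initscripts provider.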
44842 1727204515.35776: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44842 1727204515.35851: in run() - task 0affcd87-79f5-aad0-d242-000000000074 44842 1727204515.35861: variable 'ansible_search_path' from source: unknown 44842 1727204515.35876: variable 'ansible_search_path' from source: unknown 44842 1727204515.35908: calling self._execute() 44842 1727204515.35987: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.35991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.36000: variable 'omit' from source: magic vars 44842 1727204515.36291: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.36301: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.36388: variable 'network_state' from source: role '' defaults 44842 1727204515.36398: Evaluated conditional (network_state != {}): False 44842 1727204515.36401: when evaluation is False, skipping this task 44842 1727204515.36403: _execute() done 44842 1727204515.36406: dumping result to json 44842 1727204515.36408: done dumping result, returning 44842 1727204515.36415: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-aad0-d242-000000000074] 44842 1727204515.36421: sending task result for task 0affcd87-79f5-aad0-d242-000000000074 44842 1727204515.36513: done sending task result for task 0affcd87-79f5-aad0-d242-000000000074 44842 1727204515.36516: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204515.36569: no more pending results, returning what we have 44842 
1727204515.36573: results queue empty 44842 1727204515.36574: checking for any_errors_fatal 44842 1727204515.36582: done checking for any_errors_fatal 44842 1727204515.36582: checking for max_fail_percentage 44842 1727204515.36584: done checking for max_fail_percentage 44842 1727204515.36585: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.36586: done checking to see if all hosts have failed 44842 1727204515.36587: getting the remaining hosts for this loop 44842 1727204515.36589: done getting the remaining hosts for this loop 44842 1727204515.36593: getting the next task for host managed-node1 44842 1727204515.36600: done getting next task for host managed-node1 44842 1727204515.36604: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44842 1727204515.36605: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.36620: getting variables 44842 1727204515.36621: in VariableManager get_vars() 44842 1727204515.36662: Calling all_inventory to load vars for managed-node1 44842 1727204515.36666: Calling groups_inventory to load vars for managed-node1 44842 1727204515.36668: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.36677: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.36680: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.36682: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.37507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.38489: done with get_vars() 44842 1727204515.38514: done getting variables 44842 1727204515.38559: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.032) 0:00:25.553 ***** 44842 1727204515.38586: entering _queue_task() for managed-node1/fail 44842 1727204515.38832: worker is 1 (out of 1 available) 44842 1727204515.38845: exiting _queue_task() for managed-node1/fail 44842 1727204515.38858: done queuing things up, now waiting for results queue to drain 44842 1727204515.38862: waiting for pending results... 
44842 1727204515.39048: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44842 1727204515.39122: in run() - task 0affcd87-79f5-aad0-d242-000000000075 44842 1727204515.39133: variable 'ansible_search_path' from source: unknown 44842 1727204515.39137: variable 'ansible_search_path' from source: unknown 44842 1727204515.39169: calling self._execute() 44842 1727204515.39253: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.39257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.39271: variable 'omit' from source: magic vars 44842 1727204515.39630: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.39641: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.39775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204515.41982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204515.42091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204515.42142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204515.42186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204515.42218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204515.42307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.42375: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.42406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.42444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.42459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.42537: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.42551: Evaluated conditional (ansible_distribution_major_version | int > 9): False 44842 1727204515.42554: when evaluation is False, skipping this task 44842 1727204515.42561: _execute() done 44842 1727204515.42569: dumping result to json 44842 1727204515.42576: done dumping result, returning 44842 1727204515.42583: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-aad0-d242-000000000075] 44842 1727204515.42588: sending task result for task 0affcd87-79f5-aad0-d242-000000000075 44842 1727204515.42680: done sending task result for task 0affcd87-79f5-aad0-d242-000000000075 44842 1727204515.42683: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 44842 1727204515.42725: no more pending results, returning what we have 44842 1727204515.42729: 
results queue empty 44842 1727204515.42730: checking for any_errors_fatal 44842 1727204515.42735: done checking for any_errors_fatal 44842 1727204515.42736: checking for max_fail_percentage 44842 1727204515.42738: done checking for max_fail_percentage 44842 1727204515.42739: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.42740: done checking to see if all hosts have failed 44842 1727204515.42740: getting the remaining hosts for this loop 44842 1727204515.42743: done getting the remaining hosts for this loop 44842 1727204515.42747: getting the next task for host managed-node1 44842 1727204515.42753: done getting next task for host managed-node1 44842 1727204515.42757: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44842 1727204515.42759: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.42776: getting variables 44842 1727204515.42778: in VariableManager get_vars() 44842 1727204515.42816: Calling all_inventory to load vars for managed-node1 44842 1727204515.42819: Calling groups_inventory to load vars for managed-node1 44842 1727204515.42822: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.42831: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.42833: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.42836: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.43839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.45332: done with get_vars() 44842 1727204515.45361: done getting variables 44842 1727204515.45422: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.068) 0:00:25.622 ***** 44842 1727204515.45454: entering _queue_task() for managed-node1/dnf 44842 1727204515.45779: worker is 1 (out of 1 available) 44842 1727204515.45791: exiting _queue_task() for managed-node1/dnf 44842 1727204515.45803: done queuing things up, now waiting for results queue to drain 44842 1727204515.45804: waiting for pending results... 
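The EL10 teaming guard above was skipped by the same mechanism: `ansible_distribution_major_version | int > 9` evaluated to False on this managed host, so the `fail` never fired. The DNF check queued next is likewise gated, on `__network_wireless_connections_defined or __network_team_connections_defined`. A sketch of the version gate, with the condition taken verbatim from the log and the message an assumption:

```yaml
# Hedged sketch of the EL10+ teaming abort; the when-expression is
# confirmed by the log's false_condition, the msg is hypothetical.
- name: Abort applying teaming configuration on EL10 or later
  fail:
    msg: "Teaming is not supported on EL10 or later"
  when: ansible_distribution_major_version | int > 9
```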
44842 1727204515.46426: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44842 1727204515.46562: in run() - task 0affcd87-79f5-aad0-d242-000000000076 44842 1727204515.46585: variable 'ansible_search_path' from source: unknown 44842 1727204515.46592: variable 'ansible_search_path' from source: unknown 44842 1727204515.46632: calling self._execute() 44842 1727204515.46737: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.46751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.46774: variable 'omit' from source: magic vars 44842 1727204515.47176: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.47194: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.47402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204515.50696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204515.50775: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204515.50823: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204515.50889: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204515.50922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204515.51022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.51058: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.51094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.51142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.51161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.51311: variable 'ansible_distribution' from source: facts 44842 1727204515.51321: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.51341: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44842 1727204515.51461: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204515.51716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.51746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.51776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.51856: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.51878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.51925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.51958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.51991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.52039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.52069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.52115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.52144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 
1727204515.52245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.52295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.52348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.52563: variable 'network_connections' from source: play vars 44842 1727204515.52585: variable 'profile' from source: play vars 44842 1727204515.52666: variable 'profile' from source: play vars 44842 1727204515.52677: variable 'interface' from source: set_fact 44842 1727204515.52748: variable 'interface' from source: set_fact 44842 1727204515.52887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204515.53158: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204515.53207: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204515.53244: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204515.53303: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204515.53466: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204515.53538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204515.53650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.53684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204515.53735: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204515.54089: variable 'network_connections' from source: play vars 44842 1727204515.54101: variable 'profile' from source: play vars 44842 1727204515.54170: variable 'profile' from source: play vars 44842 1727204515.54184: variable 'interface' from source: set_fact 44842 1727204515.54246: variable 'interface' from source: set_fact 44842 1727204515.54317: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204515.54351: when evaluation is False, skipping this task 44842 1727204515.54359: _execute() done 44842 1727204515.54369: dumping result to json 44842 1727204515.54377: done dumping result, returning 44842 1727204515.54388: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000076] 44842 1727204515.54427: sending task result for task 0affcd87-79f5-aad0-d242-000000000076 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204515.54584: no more pending results, returning what we have 44842 1727204515.54589: results queue 
empty 44842 1727204515.54590: checking for any_errors_fatal 44842 1727204515.54598: done checking for any_errors_fatal 44842 1727204515.54599: checking for max_fail_percentage 44842 1727204515.54601: done checking for max_fail_percentage 44842 1727204515.54602: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.54603: done checking to see if all hosts have failed 44842 1727204515.54604: getting the remaining hosts for this loop 44842 1727204515.54606: done getting the remaining hosts for this loop 44842 1727204515.54610: getting the next task for host managed-node1 44842 1727204515.54617: done getting next task for host managed-node1 44842 1727204515.54621: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44842 1727204515.54623: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.54638: getting variables 44842 1727204515.54640: in VariableManager get_vars() 44842 1727204515.54686: Calling all_inventory to load vars for managed-node1 44842 1727204515.54689: Calling groups_inventory to load vars for managed-node1 44842 1727204515.54692: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.54703: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.54706: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.54709: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.55947: done sending task result for task 0affcd87-79f5-aad0-d242-000000000076 44842 1727204515.55951: WORKER PROCESS EXITING 44842 1727204515.56543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.58653: done with get_vars() 44842 1727204515.58691: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44842 1727204515.58774: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.133) 0:00:25.756 ***** 44842 1727204515.58802: entering _queue_task() for managed-node1/yum 44842 1727204515.59116: worker is 1 (out of 1 available) 44842 1727204515.59131: exiting _queue_task() for managed-node1/yum 44842 1727204515.59143: done queuing things up, now 
waiting for results queue to drain 44842 1727204515.59145: waiting for pending results... 44842 1727204515.59425: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44842 1727204515.59539: in run() - task 0affcd87-79f5-aad0-d242-000000000077 44842 1727204515.59558: variable 'ansible_search_path' from source: unknown 44842 1727204515.59568: variable 'ansible_search_path' from source: unknown 44842 1727204515.59611: calling self._execute() 44842 1727204515.59725: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.59739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.59756: variable 'omit' from source: magic vars 44842 1727204515.60367: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.60387: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.60762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204515.68739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204515.68823: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204515.68967: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204515.69011: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204515.69157: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204515.69241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.69282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.69446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.69597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.69618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.69831: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.69852: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44842 1727204515.69861: when evaluation is False, skipping this task 44842 1727204515.69872: _execute() done 44842 1727204515.69884: dumping result to json 44842 1727204515.69894: done dumping result, returning 44842 1727204515.69909: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000077] 44842 1727204515.69918: sending task result for task 0affcd87-79f5-aad0-d242-000000000077 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44842 1727204515.70090: no more pending results, returning 
what we have 44842 1727204515.70095: results queue empty 44842 1727204515.70096: checking for any_errors_fatal 44842 1727204515.70103: done checking for any_errors_fatal 44842 1727204515.70104: checking for max_fail_percentage 44842 1727204515.70106: done checking for max_fail_percentage 44842 1727204515.70107: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.70108: done checking to see if all hosts have failed 44842 1727204515.70109: getting the remaining hosts for this loop 44842 1727204515.70111: done getting the remaining hosts for this loop 44842 1727204515.70116: getting the next task for host managed-node1 44842 1727204515.70124: done getting next task for host managed-node1 44842 1727204515.70128: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44842 1727204515.70131: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.70147: getting variables 44842 1727204515.70148: in VariableManager get_vars() 44842 1727204515.70191: Calling all_inventory to load vars for managed-node1 44842 1727204515.70194: Calling groups_inventory to load vars for managed-node1 44842 1727204515.70197: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.70209: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.70211: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.70215: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.71188: done sending task result for task 0affcd87-79f5-aad0-d242-000000000077 44842 1727204515.71191: WORKER PROCESS EXITING 44842 1727204515.73193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.78081: done with get_vars() 44842 1727204515.78116: done getting variables 44842 1727204515.78486: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.197) 0:00:25.953 ***** 44842 1727204515.78521: entering _queue_task() for managed-node1/fail 44842 1727204515.79158: worker is 1 (out of 1 available) 44842 1727204515.79172: exiting _queue_task() for managed-node1/fail 44842 1727204515.79184: done queuing things up, now waiting for results queue to drain 44842 1727204515.79185: waiting for pending results... 
44842 1727204515.81581: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44842 1727204515.81681: in run() - task 0affcd87-79f5-aad0-d242-000000000078 44842 1727204515.81698: variable 'ansible_search_path' from source: unknown 44842 1727204515.81701: variable 'ansible_search_path' from source: unknown 44842 1727204515.81738: calling self._execute() 44842 1727204515.81835: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.81839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.81850: variable 'omit' from source: magic vars 44842 1727204515.82716: variable 'ansible_distribution_major_version' from source: facts 44842 1727204515.82738: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204515.82879: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204515.83334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204515.87207: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204515.87292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204515.87338: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204515.87388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204515.87499: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204515.87682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44842 1727204515.87756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.87842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.87912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.87939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.87996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.88025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.88055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.88203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.88222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.88325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204515.88354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204515.88388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.88432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204515.88452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204515.88647: variable 'network_connections' from source: play vars 44842 1727204515.88787: variable 'profile' from source: play vars 44842 1727204515.88872: variable 'profile' from source: play vars 44842 1727204515.88977: variable 'interface' from source: set_fact 44842 1727204515.89045: variable 'interface' from source: set_fact 44842 1727204515.89236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204515.89422: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204515.89471: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204515.89507: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204515.89601: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204515.89658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204515.89752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204515.89788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204515.89820: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204515.89876: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204515.90137: variable 'network_connections' from source: play vars 44842 1727204515.90210: variable 'profile' from source: play vars 44842 1727204515.90289: variable 'profile' from source: play vars 44842 1727204515.90298: variable 'interface' from source: set_fact 44842 1727204515.90365: variable 'interface' from source: set_fact 44842 1727204515.90402: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204515.90411: when evaluation is False, skipping this task 44842 1727204515.90419: _execute() done 44842 1727204515.90426: dumping result to json 44842 1727204515.90441: done dumping result, returning 44842 1727204515.90453: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000078] 44842 1727204515.90484: sending task result for task 0affcd87-79f5-aad0-d242-000000000078 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204515.90636: no more pending results, returning what we have 44842 1727204515.90640: results queue empty 44842 1727204515.90641: checking for any_errors_fatal 44842 1727204515.90648: done checking for any_errors_fatal 44842 1727204515.90649: checking for max_fail_percentage 44842 1727204515.90651: done checking for max_fail_percentage 44842 1727204515.90652: checking to see if all hosts have failed and the running result is not ok 44842 1727204515.90653: done checking to see if all hosts have failed 44842 1727204515.90653: getting the remaining hosts for this loop 44842 1727204515.90655: done getting the remaining hosts for this loop 44842 1727204515.90658: getting the next task for host managed-node1 44842 1727204515.90667: done getting next task for host managed-node1 44842 1727204515.90672: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44842 1727204515.90673: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204515.90686: getting variables 44842 1727204515.90688: in VariableManager get_vars() 44842 1727204515.90726: Calling all_inventory to load vars for managed-node1 44842 1727204515.90729: Calling groups_inventory to load vars for managed-node1 44842 1727204515.90731: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204515.90740: Calling all_plugins_play to load vars for managed-node1 44842 1727204515.90742: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204515.90744: Calling groups_plugins_play to load vars for managed-node1 44842 1727204515.91641: done sending task result for task 0affcd87-79f5-aad0-d242-000000000078 44842 1727204515.91644: WORKER PROCESS EXITING 44842 1727204515.93615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204515.97069: done with get_vars() 44842 1727204515.97100: done getting variables 44842 1727204515.97165: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.186) 0:00:26.140 ***** 44842 1727204515.97198: entering _queue_task() for managed-node1/package 44842 1727204515.98042: worker is 1 (out of 1 available) 44842 1727204515.98056: exiting _queue_task() for managed-node1/package 44842 1727204515.98074: done queuing things up, now waiting for results queue to drain 44842 1727204515.98076: waiting for pending results... 
44842 1727204515.99024: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 44842 1727204515.99365: in run() - task 0affcd87-79f5-aad0-d242-000000000079 44842 1727204515.99388: variable 'ansible_search_path' from source: unknown 44842 1727204515.99406: variable 'ansible_search_path' from source: unknown 44842 1727204515.99445: calling self._execute() 44842 1727204515.99718: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204515.99733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204515.99750: variable 'omit' from source: magic vars 44842 1727204516.00514: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.00578: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204516.01021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204516.01686: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204516.01738: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204516.01842: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204516.02063: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204516.02293: variable 'network_packages' from source: role '' defaults 44842 1727204516.02526: variable '__network_provider_setup' from source: role '' defaults 44842 1727204516.02584: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204516.02658: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204516.02801: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204516.02867: variable 
'__network_packages_default_nm' from source: role '' defaults 44842 1727204516.03193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204516.06281: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204516.06354: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204516.06393: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204516.06432: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204516.06462: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204516.06553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.06588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.06614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.06660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.06682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 
1727204516.06736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.06772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.06802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.06847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.06868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.07112: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44842 1727204516.07501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.07531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.07570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.07613: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.07633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.07734: variable 'ansible_python' from source: facts 44842 1727204516.07769: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44842 1727204516.07865: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204516.08076: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204516.08385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.08414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.08577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.08623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.08643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.08700: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.08792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.08899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.08942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.08999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.09282: variable 'network_connections' from source: play vars 44842 1727204516.09474: variable 'profile' from source: play vars 44842 1727204516.09614: variable 'profile' from source: play vars 44842 1727204516.09638: variable 'interface' from source: set_fact 44842 1727204516.09725: variable 'interface' from source: set_fact 44842 1727204516.09824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204516.09857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204516.09896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.09927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204516.09978: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204516.10292: variable 'network_connections' from source: play vars 44842 1727204516.10308: variable 'profile' from source: play vars 44842 1727204516.10424: variable 'profile' from source: play vars 44842 1727204516.10436: variable 'interface' from source: set_fact 44842 1727204516.10512: variable 'interface' from source: set_fact 44842 1727204516.10555: variable '__network_packages_default_wireless' from source: role '' defaults 44842 1727204516.10646: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204516.10986: variable 'network_connections' from source: play vars 44842 1727204516.10996: variable 'profile' from source: play vars 44842 1727204516.11064: variable 'profile' from source: play vars 44842 1727204516.11077: variable 'interface' from source: set_fact 44842 1727204516.11183: variable 'interface' from source: set_fact 44842 1727204516.11214: variable '__network_packages_default_team' from source: role '' defaults 44842 1727204516.11303: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204516.11715: variable 'network_connections' from source: play vars 44842 1727204516.12197: variable 'profile' from source: play vars 44842 1727204516.12270: variable 'profile' from source: play vars 44842 1727204516.12377: variable 'interface' from source: set_fact 44842 1727204516.12490: variable 'interface' from source: set_fact 44842 1727204516.12748: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204516.12891: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204516.12904: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204516.13084: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204516.13521: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44842 1727204516.14197: variable 'network_connections' from source: play vars 44842 1727204516.14260: variable 'profile' from source: play vars 44842 1727204516.14322: variable 'profile' from source: play vars 44842 1727204516.14487: variable 'interface' from source: set_fact 44842 1727204516.14636: variable 'interface' from source: set_fact 44842 1727204516.14682: variable 'ansible_distribution' from source: facts 44842 1727204516.14778: variable '__network_rh_distros' from source: role '' defaults 44842 1727204516.14790: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.14816: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44842 1727204516.15094: variable 'ansible_distribution' from source: facts 44842 1727204516.15245: variable '__network_rh_distros' from source: role '' defaults 44842 1727204516.15256: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.15277: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44842 1727204516.15676: variable 'ansible_distribution' from source: facts 44842 1727204516.15686: variable '__network_rh_distros' from source: role '' defaults 44842 1727204516.15696: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.15739: variable 'network_provider' from source: set_fact 44842 1727204516.15761: variable 'ansible_facts' from source: unknown 44842 1727204516.17362: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44842 
1727204516.17372: when evaluation is False, skipping this task 44842 1727204516.17381: _execute() done 44842 1727204516.17388: dumping result to json 44842 1727204516.17396: done dumping result, returning 44842 1727204516.17408: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-aad0-d242-000000000079] 44842 1727204516.17423: sending task result for task 0affcd87-79f5-aad0-d242-000000000079 skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44842 1727204516.17584: no more pending results, returning what we have 44842 1727204516.17589: results queue empty 44842 1727204516.17590: checking for any_errors_fatal 44842 1727204516.17598: done checking for any_errors_fatal 44842 1727204516.17599: checking for max_fail_percentage 44842 1727204516.17601: done checking for max_fail_percentage 44842 1727204516.17602: checking to see if all hosts have failed and the running result is not ok 44842 1727204516.17603: done checking to see if all hosts have failed 44842 1727204516.17604: getting the remaining hosts for this loop 44842 1727204516.17606: done getting the remaining hosts for this loop 44842 1727204516.17611: getting the next task for host managed-node1 44842 1727204516.17619: done getting next task for host managed-node1 44842 1727204516.17624: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44842 1727204516.17627: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204516.17644: getting variables 44842 1727204516.17646: in VariableManager get_vars() 44842 1727204516.17695: Calling all_inventory to load vars for managed-node1 44842 1727204516.17698: Calling groups_inventory to load vars for managed-node1 44842 1727204516.17701: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204516.17712: Calling all_plugins_play to load vars for managed-node1 44842 1727204516.17720: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204516.17724: Calling groups_plugins_play to load vars for managed-node1 44842 1727204516.18891: done sending task result for task 0affcd87-79f5-aad0-d242-000000000079 44842 1727204516.18895: WORKER PROCESS EXITING 44842 1727204516.20035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204516.21715: done with get_vars() 44842 1727204516.21738: done getting variables 44842 1727204516.21786: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.246) 0:00:26.386 ***** 44842 1727204516.21811: entering _queue_task() for managed-node1/package 44842 1727204516.22044: worker is 1 (out of 1 available) 44842 1727204516.22059: exiting _queue_task() for managed-node1/package 44842 1727204516.22073: done queuing things up, now waiting for results queue to drain 44842 1727204516.22074: waiting for pending results... 
44842 1727204516.22273: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44842 1727204516.22391: in run() - task 0affcd87-79f5-aad0-d242-00000000007a 44842 1727204516.22417: variable 'ansible_search_path' from source: unknown 44842 1727204516.22425: variable 'ansible_search_path' from source: unknown 44842 1727204516.22475: calling self._execute() 44842 1727204516.22581: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204516.22592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204516.22607: variable 'omit' from source: magic vars 44842 1727204516.23064: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.23086: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204516.23338: variable 'network_state' from source: role '' defaults 44842 1727204516.23360: Evaluated conditional (network_state != {}): False 44842 1727204516.23370: when evaluation is False, skipping this task 44842 1727204516.23379: _execute() done 44842 1727204516.23387: dumping result to json 44842 1727204516.23395: done dumping result, returning 44842 1727204516.23405: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-aad0-d242-00000000007a] 44842 1727204516.23418: sending task result for task 0affcd87-79f5-aad0-d242-00000000007a skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204516.23596: no more pending results, returning what we have 44842 1727204516.23601: results queue empty 44842 1727204516.23602: checking for any_errors_fatal 44842 1727204516.23611: done checking for any_errors_fatal 44842 1727204516.23612: checking for max_fail_percentage 44842 
1727204516.23613: done checking for max_fail_percentage 44842 1727204516.23614: checking to see if all hosts have failed and the running result is not ok 44842 1727204516.23615: done checking to see if all hosts have failed 44842 1727204516.23616: getting the remaining hosts for this loop 44842 1727204516.23618: done getting the remaining hosts for this loop 44842 1727204516.23622: getting the next task for host managed-node1 44842 1727204516.23629: done getting next task for host managed-node1 44842 1727204516.23633: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44842 1727204516.23636: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204516.23653: getting variables 44842 1727204516.23655: in VariableManager get_vars() 44842 1727204516.23710: Calling all_inventory to load vars for managed-node1 44842 1727204516.23713: Calling groups_inventory to load vars for managed-node1 44842 1727204516.23716: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204516.23728: Calling all_plugins_play to load vars for managed-node1 44842 1727204516.23730: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204516.23733: Calling groups_plugins_play to load vars for managed-node1 44842 1727204516.24324: done sending task result for task 0affcd87-79f5-aad0-d242-00000000007a 44842 1727204516.24327: WORKER PROCESS EXITING 44842 1727204516.25771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204516.28612: done with get_vars() 44842 1727204516.28644: done getting variables 44842 1727204516.28712: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.069) 0:00:26.455 ***** 44842 1727204516.28748: entering _queue_task() for managed-node1/package 44842 1727204516.29087: worker is 1 (out of 1 available) 44842 1727204516.29099: exiting _queue_task() for managed-node1/package 44842 1727204516.29111: done queuing things up, now waiting for results queue to drain 44842 1727204516.29112: waiting for pending results... 44842 1727204516.29643: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44842 1727204516.29760: in run() - task 0affcd87-79f5-aad0-d242-00000000007b 44842 1727204516.29790: variable 'ansible_search_path' from source: unknown 44842 1727204516.29800: variable 'ansible_search_path' from source: unknown 44842 1727204516.29840: calling self._execute() 44842 1727204516.29946: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204516.29957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204516.29975: variable 'omit' from source: magic vars 44842 1727204516.30373: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.30391: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204516.30525: variable 'network_state' from source: role '' defaults 44842 1727204516.30547: Evaluated conditional (network_state != {}): False 44842 1727204516.30555: when evaluation is False, 
skipping this task 44842 1727204516.30571: _execute() done 44842 1727204516.30578: dumping result to json 44842 1727204516.30586: done dumping result, returning 44842 1727204516.30595: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-aad0-d242-00000000007b] 44842 1727204516.30605: sending task result for task 0affcd87-79f5-aad0-d242-00000000007b 44842 1727204516.30727: done sending task result for task 0affcd87-79f5-aad0-d242-00000000007b 44842 1727204516.30734: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204516.30798: no more pending results, returning what we have 44842 1727204516.30802: results queue empty 44842 1727204516.30803: checking for any_errors_fatal 44842 1727204516.30812: done checking for any_errors_fatal 44842 1727204516.30813: checking for max_fail_percentage 44842 1727204516.30815: done checking for max_fail_percentage 44842 1727204516.30816: checking to see if all hosts have failed and the running result is not ok 44842 1727204516.30817: done checking to see if all hosts have failed 44842 1727204516.30818: getting the remaining hosts for this loop 44842 1727204516.30820: done getting the remaining hosts for this loop 44842 1727204516.30825: getting the next task for host managed-node1 44842 1727204516.30832: done getting next task for host managed-node1 44842 1727204516.30837: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44842 1727204516.30840: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204516.30858: getting variables 44842 1727204516.30859: in VariableManager get_vars() 44842 1727204516.30898: Calling all_inventory to load vars for managed-node1 44842 1727204516.30902: Calling groups_inventory to load vars for managed-node1 44842 1727204516.30904: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204516.30917: Calling all_plugins_play to load vars for managed-node1 44842 1727204516.30919: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204516.30922: Calling groups_plugins_play to load vars for managed-node1 44842 1727204516.33428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204516.35653: done with get_vars() 44842 1727204516.35693: done getting variables 44842 1727204516.35761: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.070) 0:00:26.526 ***** 44842 1727204516.35795: entering _queue_task() for managed-node1/service 44842 1727204516.36150: worker is 1 (out of 1 available) 44842 1727204516.36163: exiting _queue_task() for managed-node1/service 44842 1727204516.36890: done queuing things up, now waiting for results queue to drain 44842 1727204516.36892: waiting for pending results... 
44842 1727204516.37205: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44842 1727204516.37343: in run() - task 0affcd87-79f5-aad0-d242-00000000007c 44842 1727204516.37366: variable 'ansible_search_path' from source: unknown 44842 1727204516.37377: variable 'ansible_search_path' from source: unknown 44842 1727204516.37424: calling self._execute() 44842 1727204516.37535: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204516.37552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204516.37571: variable 'omit' from source: magic vars 44842 1727204516.37983: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.38003: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204516.38133: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204516.38352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204516.52585: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204516.52659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204516.52702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204516.52738: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204516.52774: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204516.52847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 44842 1727204516.52898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.52929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.52979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.52999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.53046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.53075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.53108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.53151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.53171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.53227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.53256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.53291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.53337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.53357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.53575: variable 'network_connections' from source: play vars 44842 1727204516.53591: variable 'profile' from source: play vars 44842 1727204516.53670: variable 'profile' from source: play vars 44842 1727204516.53680: variable 'interface' from source: set_fact 44842 1727204516.53745: variable 'interface' from source: set_fact 44842 1727204516.53822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204516.53989: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204516.54031: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204516.54067: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204516.54105: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204516.54150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204516.54178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204516.54210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.54241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204516.54286: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204516.54554: variable 'network_connections' from source: play vars 44842 1727204516.54569: variable 'profile' from source: play vars 44842 1727204516.54634: variable 'profile' from source: play vars 44842 1727204516.54647: variable 'interface' from source: set_fact 44842 1727204516.54710: variable 'interface' from source: set_fact 44842 1727204516.54740: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204516.54753: when evaluation is False, skipping this task 44842 1727204516.54759: _execute() done 44842 1727204516.54768: dumping result to json 44842 1727204516.54774: done dumping result, returning 44842 1727204516.54785: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-00000000007c] 44842 1727204516.54800: sending task result for task 0affcd87-79f5-aad0-d242-00000000007c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204516.54940: no more pending results, returning what we have 44842 1727204516.54944: results queue empty 44842 1727204516.54945: checking for any_errors_fatal 44842 1727204516.54953: done checking for any_errors_fatal 44842 1727204516.54953: checking for max_fail_percentage 44842 1727204516.54955: done checking for max_fail_percentage 44842 1727204516.54956: checking to see if all hosts have failed and the running result is not ok 44842 1727204516.54957: done checking to see if all hosts have failed 44842 1727204516.54958: getting the remaining hosts for this loop 44842 1727204516.54960: done getting the remaining hosts for this loop 44842 1727204516.54963: getting the next task for host managed-node1 44842 1727204516.54971: done getting next task for host managed-node1 44842 1727204516.54976: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44842 1727204516.54978: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204516.54991: getting variables 44842 1727204516.54993: in VariableManager get_vars() 44842 1727204516.55030: Calling all_inventory to load vars for managed-node1 44842 1727204516.55033: Calling groups_inventory to load vars for managed-node1 44842 1727204516.55035: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204516.55044: Calling all_plugins_play to load vars for managed-node1 44842 1727204516.55046: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204516.55049: Calling groups_plugins_play to load vars for managed-node1 44842 1727204516.56082: done sending task result for task 0affcd87-79f5-aad0-d242-00000000007c 44842 1727204516.56086: WORKER PROCESS EXITING 44842 1727204516.61890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204516.63771: done with get_vars() 44842 1727204516.63809: done getting variables 44842 1727204516.63868: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:01:56 -0400 (0:00:00.281) 0:00:26.807 ***** 44842 1727204516.63922: entering _queue_task() for managed-node1/service 44842 1727204516.64269: worker is 1 (out of 1 available) 44842 1727204516.64281: exiting _queue_task() for managed-node1/service 44842 1727204516.64293: done queuing things up, now waiting for results queue to drain 44842 1727204516.64295: waiting for pending results... 
44842 1727204516.64601: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44842 1727204516.64747: in run() - task 0affcd87-79f5-aad0-d242-00000000007d 44842 1727204516.64769: variable 'ansible_search_path' from source: unknown 44842 1727204516.64777: variable 'ansible_search_path' from source: unknown 44842 1727204516.64817: calling self._execute() 44842 1727204516.64927: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204516.64939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204516.64956: variable 'omit' from source: magic vars 44842 1727204516.65371: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.65390: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204516.65584: variable 'network_provider' from source: set_fact 44842 1727204516.65595: variable 'network_state' from source: role '' defaults 44842 1727204516.65610: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44842 1727204516.65622: variable 'omit' from source: magic vars 44842 1727204516.65667: variable 'omit' from source: magic vars 44842 1727204516.65701: variable 'network_service_name' from source: role '' defaults 44842 1727204516.65782: variable 'network_service_name' from source: role '' defaults 44842 1727204516.65926: variable '__network_provider_setup' from source: role '' defaults 44842 1727204516.65938: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204516.66007: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204516.66020: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204516.66118: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204516.66353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 44842 1727204516.69171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204516.69261: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204516.69306: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204516.69350: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204516.69384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204516.69470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.69504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.69533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.69586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.69605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.69660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44842 1727204516.69695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.69723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.69773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.69794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.70040: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44842 1727204516.70167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.70197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.70230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.70275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.70378: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.70478: variable 'ansible_python' from source: facts 44842 1727204516.70507: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44842 1727204516.70628: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204516.70718: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204516.70851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.70885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.70914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.70956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.70979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.71030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204516.71069: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204516.71119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.71219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204516.71241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204516.71398: variable 'network_connections' from source: play vars 44842 1727204516.71413: variable 'profile' from source: play vars 44842 1727204516.71494: variable 'profile' from source: play vars 44842 1727204516.71505: variable 'interface' from source: set_fact 44842 1727204516.71574: variable 'interface' from source: set_fact 44842 1727204516.71736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204516.71943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204516.72005: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204516.72056: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204516.72110: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204516.72250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204516.72287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204516.72327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204516.72366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204516.72421: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204516.72720: variable 'network_connections' from source: play vars 44842 1727204516.72735: variable 'profile' from source: play vars 44842 1727204516.72815: variable 'profile' from source: play vars 44842 1727204516.72827: variable 'interface' from source: set_fact 44842 1727204516.72897: variable 'interface' from source: set_fact 44842 1727204516.72935: variable '__network_packages_default_wireless' from source: role '' defaults 44842 1727204516.73025: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204516.73459: variable 'network_connections' from source: play vars 44842 1727204516.73471: variable 'profile' from source: play vars 44842 1727204516.73550: variable 'profile' from source: play vars 44842 1727204516.73560: variable 'interface' from source: set_fact 44842 1727204516.73643: variable 'interface' from source: set_fact 44842 1727204516.73677: variable '__network_packages_default_team' from source: role '' defaults 44842 1727204516.73762: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204516.74080: variable 
'network_connections' from source: play vars 44842 1727204516.74089: variable 'profile' from source: play vars 44842 1727204516.74167: variable 'profile' from source: play vars 44842 1727204516.74178: variable 'interface' from source: set_fact 44842 1727204516.74255: variable 'interface' from source: set_fact 44842 1727204516.74317: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204516.74387: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204516.74445: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204516.74514: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204516.74871: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44842 1727204516.75600: variable 'network_connections' from source: play vars 44842 1727204516.75611: variable 'profile' from source: play vars 44842 1727204516.75685: variable 'profile' from source: play vars 44842 1727204516.75694: variable 'interface' from source: set_fact 44842 1727204516.75776: variable 'interface' from source: set_fact 44842 1727204516.75793: variable 'ansible_distribution' from source: facts 44842 1727204516.75802: variable '__network_rh_distros' from source: role '' defaults 44842 1727204516.75811: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.75830: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44842 1727204516.76020: variable 'ansible_distribution' from source: facts 44842 1727204516.76030: variable '__network_rh_distros' from source: role '' defaults 44842 1727204516.76040: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.76058: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44842 1727204516.76276: variable 'ansible_distribution' from source: 
facts 44842 1727204516.76286: variable '__network_rh_distros' from source: role '' defaults 44842 1727204516.76295: variable 'ansible_distribution_major_version' from source: facts 44842 1727204516.76342: variable 'network_provider' from source: set_fact 44842 1727204516.76373: variable 'omit' from source: magic vars 44842 1727204516.76408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204516.76445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204516.76475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204516.76500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204516.76516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204516.76555: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204516.76566: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204516.76574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204516.76872: Set connection var ansible_shell_type to sh 44842 1727204516.76899: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204516.76910: Set connection var ansible_connection to ssh 44842 1727204516.76920: Set connection var ansible_pipelining to False 44842 1727204516.76930: Set connection var ansible_timeout to 10 44842 1727204516.76942: Set connection var ansible_shell_executable to /bin/sh 44842 1727204516.76978: variable 'ansible_shell_executable' from source: unknown 44842 1727204516.76986: variable 'ansible_connection' from source: unknown 44842 1727204516.76993: variable 'ansible_module_compression' from source: unknown 44842 1727204516.76999: 
variable 'ansible_shell_type' from source: unknown 44842 1727204516.77006: variable 'ansible_shell_executable' from source: unknown 44842 1727204516.77012: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204516.77024: variable 'ansible_pipelining' from source: unknown 44842 1727204516.77030: variable 'ansible_timeout' from source: unknown 44842 1727204516.77037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204516.77151: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204516.77174: variable 'omit' from source: magic vars 44842 1727204516.77185: starting attempt loop 44842 1727204516.77192: running the handler 44842 1727204516.77285: variable 'ansible_facts' from source: unknown 44842 1727204516.78273: _low_level_execute_command(): starting 44842 1727204516.78287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204516.79051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204516.79072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204516.79089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.79108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.79156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204516.79172: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204516.79188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.79208: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204516.79222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204516.79235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204516.79251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204516.79267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.79284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.79295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204516.79305: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204516.79317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.79434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204516.79509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204516.79526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204516.79637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204516.81300: stdout chunk (state=3): >>>/root <<< 44842 1727204516.81513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204516.81518: stdout chunk (state=3): >>><<< 44842 1727204516.81521: stderr chunk (state=3): >>><<< 44842 1727204516.81563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204516.81788: _low_level_execute_command(): starting 44842 1727204516.81793: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650 `" && echo ansible-tmp-1727204516.8154087-46964-169792131169650="` echo /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650 `" ) && sleep 0' 44842 1727204516.82904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.82908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.82939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.82942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204516.82945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.83107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204516.83128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204516.83132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204516.83214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204516.85058: stdout chunk (state=3): >>>ansible-tmp-1727204516.8154087-46964-169792131169650=/root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650 <<< 44842 1727204516.85191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204516.85285: stderr chunk (state=3): >>><<< 44842 1727204516.85296: stdout chunk (state=3): >>><<< 44842 1727204516.85672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204516.8154087-46964-169792131169650=/root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204516.85681: variable 'ansible_module_compression' from source: unknown 44842 1727204516.85683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44842 1727204516.85687: variable 'ansible_facts' from source: unknown 44842 1727204516.85715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/AnsiballZ_systemd.py 44842 1727204516.86052: Sending initial data 44842 1727204516.86055: Sent initial data (156 bytes) 44842 1727204516.87080: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204516.87089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204516.87098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.87113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.87171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.87177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204516.87223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.87286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204516.87292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204516.87369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204516.89095: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204516.89143: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204516.89192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp31eet8oy /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/AnsiballZ_systemd.py <<< 44842 1727204516.89256: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204516.91810: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204516.92094: stderr chunk (state=3): >>><<< 44842 1727204516.92098: stdout chunk (state=3): >>><<< 44842 1727204516.92100: done transferring module to remote 44842 1727204516.92102: _low_level_execute_command(): starting 44842 1727204516.92105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/ /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/AnsiballZ_systemd.py && sleep 0' 44842 1727204516.92732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.92737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.92787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.92791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.92793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.92835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204516.92840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 
1727204516.92908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204516.94685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204516.94689: stderr chunk (state=3): >>><<< 44842 1727204516.94700: stdout chunk (state=3): >>><<< 44842 1727204516.94711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204516.94714: _low_level_execute_command(): starting 44842 1727204516.94719: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/AnsiballZ_systemd.py && sleep 0' 44842 1727204516.95385: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204516.95394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 
1727204516.95404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.95426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.95474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204516.95478: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204516.95506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.95509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204516.95517: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204516.95553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204516.95557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204516.95563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204516.95566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204516.95568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204516.95570: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204516.95582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204516.95651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204516.95655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204516.95727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.20574: stdout chunk (state=3): >>> {"name": 
"NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", 
"ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14221312", "MemoryAvailable": "infinity", "CPUUsageNSec": "1649978000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": <<< 44842 1727204517.20581: stdout chunk (state=3): >>>"0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": 
"default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perp<<< 44842 1727204517.20590: stdout chunk (state=3): >>>etual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44842 1727204517.21976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204517.22031: stderr chunk (state=3): >>><<< 44842 1727204517.22036: stdout chunk (state=3): >>><<< 44842 1727204517.22054: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14221312", "MemoryAvailable": "infinity", "CPUUsageNSec": "1649978000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": 
"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204517.22168: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204517.22182: _low_level_execute_command(): starting 44842 1727204517.22187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204516.8154087-46964-169792131169650/ > /dev/null 2>&1 && sleep 0' 44842 1727204517.22659: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.22668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.22696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.22712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.22722: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.22771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.22785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.22848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.24629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204517.24701: stderr chunk (state=3): >>><<< 44842 1727204517.24705: stdout chunk (state=3): >>><<< 44842 1727204517.24718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 44842 1727204517.24726: handler run complete 44842 1727204517.24769: attempt loop complete, returning result 44842 1727204517.24773: _execute() done 44842 1727204517.24775: dumping result to json 44842 1727204517.24787: done dumping result, returning 44842 1727204517.24798: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-aad0-d242-00000000007d] 44842 1727204517.24801: sending task result for task 0affcd87-79f5-aad0-d242-00000000007d 44842 1727204517.25179: done sending task result for task 0affcd87-79f5-aad0-d242-00000000007d 44842 1727204517.25182: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204517.25233: no more pending results, returning what we have 44842 1727204517.25237: results queue empty 44842 1727204517.25237: checking for any_errors_fatal 44842 1727204517.25246: done checking for any_errors_fatal 44842 1727204517.25247: checking for max_fail_percentage 44842 1727204517.25249: done checking for max_fail_percentage 44842 1727204517.25250: checking to see if all hosts have failed and the running result is not ok 44842 1727204517.25251: done checking to see if all hosts have failed 44842 1727204517.25251: getting the remaining hosts for this loop 44842 1727204517.25253: done getting the remaining hosts for this loop 44842 1727204517.25257: getting the next task for host managed-node1 44842 1727204517.25262: done getting next task for host managed-node1 44842 1727204517.25267: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44842 1727204517.25269: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204517.25281: getting variables 44842 1727204517.25282: in VariableManager get_vars() 44842 1727204517.25315: Calling all_inventory to load vars for managed-node1 44842 1727204517.25317: Calling groups_inventory to load vars for managed-node1 44842 1727204517.25319: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204517.25328: Calling all_plugins_play to load vars for managed-node1 44842 1727204517.25330: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204517.25332: Calling groups_plugins_play to load vars for managed-node1 44842 1727204517.26177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204517.27244: done with get_vars() 44842 1727204517.27266: done getting variables 44842 1727204517.27311: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:57 -0400 (0:00:00.634) 0:00:27.441 ***** 44842 1727204517.27334: entering _queue_task() for managed-node1/service 44842 1727204517.27580: worker is 1 (out of 1 available) 44842 1727204517.27592: exiting _queue_task() for managed-node1/service 44842 1727204517.27606: done queuing things up, now waiting for results queue to drain 44842 1727204517.27608: waiting for pending results... 
44842 1727204517.27807: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44842 1727204517.27890: in run() - task 0affcd87-79f5-aad0-d242-00000000007e 44842 1727204517.27902: variable 'ansible_search_path' from source: unknown 44842 1727204517.27906: variable 'ansible_search_path' from source: unknown 44842 1727204517.27938: calling self._execute() 44842 1727204517.28021: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204517.28025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204517.28037: variable 'omit' from source: magic vars 44842 1727204517.28334: variable 'ansible_distribution_major_version' from source: facts 44842 1727204517.28342: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204517.28430: variable 'network_provider' from source: set_fact 44842 1727204517.28434: Evaluated conditional (network_provider == "nm"): True 44842 1727204517.28508: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204517.28577: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204517.28709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204517.30321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204517.30370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204517.30400: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204517.30429: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204517.30450: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204517.30524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204517.30546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204517.30569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204517.30596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204517.30607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204517.30642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204517.30660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204517.30681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204517.30706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204517.30719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204517.30748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204517.30768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204517.30786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204517.30810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204517.30821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204517.30928: variable 'network_connections' from source: play vars 44842 1727204517.30942: variable 'profile' from source: play vars 44842 1727204517.31000: variable 'profile' from source: play vars 44842 1727204517.31004: variable 'interface' from source: set_fact 44842 1727204517.31052: variable 'interface' from source: set_fact 44842 1727204517.31106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204517.31220: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204517.31248: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204517.31280: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204517.31301: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204517.31331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204517.31346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204517.31369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204517.31389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204517.31431: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204517.31601: variable 'network_connections' from source: play vars 44842 1727204517.31609: variable 'profile' from source: play vars 44842 1727204517.31653: variable 'profile' from source: play vars 44842 1727204517.31657: variable 'interface' from source: set_fact 44842 1727204517.31705: variable 'interface' from source: set_fact 44842 1727204517.31727: Evaluated conditional (__network_wpa_supplicant_required): False 44842 1727204517.31730: when evaluation is False, skipping this task 44842 1727204517.31733: _execute() done 44842 1727204517.31747: dumping result 
to json 44842 1727204517.31749: done dumping result, returning 44842 1727204517.31752: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-aad0-d242-00000000007e] 44842 1727204517.31754: sending task result for task 0affcd87-79f5-aad0-d242-00000000007e 44842 1727204517.31843: done sending task result for task 0affcd87-79f5-aad0-d242-00000000007e 44842 1727204517.31846: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44842 1727204517.31907: no more pending results, returning what we have 44842 1727204517.31911: results queue empty 44842 1727204517.31912: checking for any_errors_fatal 44842 1727204517.31935: done checking for any_errors_fatal 44842 1727204517.31936: checking for max_fail_percentage 44842 1727204517.31938: done checking for max_fail_percentage 44842 1727204517.31939: checking to see if all hosts have failed and the running result is not ok 44842 1727204517.31940: done checking to see if all hosts have failed 44842 1727204517.31941: getting the remaining hosts for this loop 44842 1727204517.31943: done getting the remaining hosts for this loop 44842 1727204517.31947: getting the next task for host managed-node1 44842 1727204517.31953: done getting next task for host managed-node1 44842 1727204517.31957: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44842 1727204517.31958: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204517.31975: getting variables 44842 1727204517.31977: in VariableManager get_vars() 44842 1727204517.32012: Calling all_inventory to load vars for managed-node1 44842 1727204517.32015: Calling groups_inventory to load vars for managed-node1 44842 1727204517.32017: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204517.32030: Calling all_plugins_play to load vars for managed-node1 44842 1727204517.32032: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204517.32035: Calling groups_plugins_play to load vars for managed-node1 44842 1727204517.32900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204517.33886: done with get_vars() 44842 1727204517.33909: done getting variables 44842 1727204517.33955: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:57 -0400 (0:00:00.066) 0:00:27.507 ***** 44842 1727204517.33983: entering _queue_task() for managed-node1/service 44842 1727204517.34226: worker is 1 (out of 1 available) 44842 1727204517.34241: exiting _queue_task() for managed-node1/service 44842 1727204517.34254: done queuing things up, now waiting for results queue to drain 44842 1727204517.34255: waiting for pending results... 
44842 1727204517.34447: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 44842 1727204517.34517: in run() - task 0affcd87-79f5-aad0-d242-00000000007f 44842 1727204517.34528: variable 'ansible_search_path' from source: unknown 44842 1727204517.34531: variable 'ansible_search_path' from source: unknown 44842 1727204517.34567: calling self._execute() 44842 1727204517.34643: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204517.34647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204517.34656: variable 'omit' from source: magic vars 44842 1727204517.34950: variable 'ansible_distribution_major_version' from source: facts 44842 1727204517.34962: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204517.35040: variable 'network_provider' from source: set_fact 44842 1727204517.35044: Evaluated conditional (network_provider == "initscripts"): False 44842 1727204517.35047: when evaluation is False, skipping this task 44842 1727204517.35053: _execute() done 44842 1727204517.35056: dumping result to json 44842 1727204517.35058: done dumping result, returning 44842 1727204517.35067: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-aad0-d242-00000000007f] 44842 1727204517.35084: sending task result for task 0affcd87-79f5-aad0-d242-00000000007f 44842 1727204517.35170: done sending task result for task 0affcd87-79f5-aad0-d242-00000000007f 44842 1727204517.35173: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204517.35214: no more pending results, returning what we have 44842 1727204517.35217: results queue empty 44842 1727204517.35218: checking for any_errors_fatal 44842 1727204517.35226: done checking for 
any_errors_fatal 44842 1727204517.35227: checking for max_fail_percentage 44842 1727204517.35229: done checking for max_fail_percentage 44842 1727204517.35229: checking to see if all hosts have failed and the running result is not ok 44842 1727204517.35230: done checking to see if all hosts have failed 44842 1727204517.35231: getting the remaining hosts for this loop 44842 1727204517.35233: done getting the remaining hosts for this loop 44842 1727204517.35236: getting the next task for host managed-node1 44842 1727204517.35244: done getting next task for host managed-node1 44842 1727204517.35248: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44842 1727204517.35250: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204517.35267: getting variables 44842 1727204517.35269: in VariableManager get_vars() 44842 1727204517.35307: Calling all_inventory to load vars for managed-node1 44842 1727204517.35310: Calling groups_inventory to load vars for managed-node1 44842 1727204517.35312: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204517.35322: Calling all_plugins_play to load vars for managed-node1 44842 1727204517.35325: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204517.35327: Calling groups_plugins_play to load vars for managed-node1 44842 1727204517.36312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204517.37257: done with get_vars() 44842 1727204517.37278: done getting variables 44842 1727204517.37322: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:57 -0400 (0:00:00.033) 0:00:27.541 ***** 44842 1727204517.37344: entering _queue_task() for managed-node1/copy 44842 1727204517.37582: worker is 1 (out of 1 available) 44842 1727204517.37596: exiting _queue_task() for managed-node1/copy 44842 1727204517.37609: done queuing things up, now waiting for results queue to drain 44842 1727204517.37611: waiting for pending results... 
44842 1727204517.37805: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44842 1727204517.37885: in run() - task 0affcd87-79f5-aad0-d242-000000000080 44842 1727204517.37898: variable 'ansible_search_path' from source: unknown 44842 1727204517.37906: variable 'ansible_search_path' from source: unknown 44842 1727204517.37936: calling self._execute() 44842 1727204517.38026: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204517.38030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204517.38040: variable 'omit' from source: magic vars 44842 1727204517.38325: variable 'ansible_distribution_major_version' from source: facts 44842 1727204517.38335: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204517.38423: variable 'network_provider' from source: set_fact 44842 1727204517.38427: Evaluated conditional (network_provider == "initscripts"): False 44842 1727204517.38432: when evaluation is False, skipping this task 44842 1727204517.38434: _execute() done 44842 1727204517.38437: dumping result to json 44842 1727204517.38447: done dumping result, returning 44842 1727204517.38453: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-aad0-d242-000000000080] 44842 1727204517.38461: sending task result for task 0affcd87-79f5-aad0-d242-000000000080 44842 1727204517.38550: done sending task result for task 0affcd87-79f5-aad0-d242-000000000080 44842 1727204517.38553: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44842 1727204517.38605: no more pending results, returning what we have 44842 1727204517.38609: results queue empty 44842 1727204517.38610: checking for 
any_errors_fatal 44842 1727204517.38616: done checking for any_errors_fatal 44842 1727204517.38617: checking for max_fail_percentage 44842 1727204517.38618: done checking for max_fail_percentage 44842 1727204517.38619: checking to see if all hosts have failed and the running result is not ok 44842 1727204517.38620: done checking to see if all hosts have failed 44842 1727204517.38621: getting the remaining hosts for this loop 44842 1727204517.38623: done getting the remaining hosts for this loop 44842 1727204517.38627: getting the next task for host managed-node1 44842 1727204517.38633: done getting next task for host managed-node1 44842 1727204517.38637: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44842 1727204517.38639: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204517.38652: getting variables 44842 1727204517.38654: in VariableManager get_vars() 44842 1727204517.38697: Calling all_inventory to load vars for managed-node1 44842 1727204517.38700: Calling groups_inventory to load vars for managed-node1 44842 1727204517.38702: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204517.38711: Calling all_plugins_play to load vars for managed-node1 44842 1727204517.38713: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204517.38715: Calling groups_plugins_play to load vars for managed-node1 44842 1727204517.39554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204517.40529: done with get_vars() 44842 1727204517.40547: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:57 -0400 (0:00:00.032) 0:00:27.574 ***** 44842 1727204517.40609: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44842 1727204517.40840: worker is 1 (out of 1 available) 44842 1727204517.40853: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44842 1727204517.40870: done queuing things up, now waiting for results queue to drain 44842 1727204517.40872: waiting for pending results... 
44842 1727204517.41049: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44842 1727204517.41123: in run() - task 0affcd87-79f5-aad0-d242-000000000081 44842 1727204517.41134: variable 'ansible_search_path' from source: unknown 44842 1727204517.41138: variable 'ansible_search_path' from source: unknown 44842 1727204517.41169: calling self._execute() 44842 1727204517.41255: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204517.41259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204517.41273: variable 'omit' from source: magic vars 44842 1727204517.41561: variable 'ansible_distribution_major_version' from source: facts 44842 1727204517.41575: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204517.41580: variable 'omit' from source: magic vars 44842 1727204517.41608: variable 'omit' from source: magic vars 44842 1727204517.41725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204517.43510: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204517.43552: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204517.43583: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204517.43610: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204517.43629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204517.43689: variable 'network_provider' from source: set_fact 44842 1727204517.43782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204517.43802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204517.43821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204517.43847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204517.43858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204517.43915: variable 'omit' from source: magic vars 44842 1727204517.43994: variable 'omit' from source: magic vars 44842 1727204517.44068: variable 'network_connections' from source: play vars 44842 1727204517.44077: variable 'profile' from source: play vars 44842 1727204517.44125: variable 'profile' from source: play vars 44842 1727204517.44129: variable 'interface' from source: set_fact 44842 1727204517.44175: variable 'interface' from source: set_fact 44842 1727204517.44277: variable 'omit' from source: magic vars 44842 1727204517.44284: variable '__lsr_ansible_managed' from source: task vars 44842 1727204517.44328: variable '__lsr_ansible_managed' from source: task vars 44842 1727204517.44518: Loaded config def from plugin (lookup/template) 44842 1727204517.44522: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44842 1727204517.44543: File lookup term: get_ansible_managed.j2 44842 
1727204517.44551: variable 'ansible_search_path' from source: unknown 44842 1727204517.44566: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44842 1727204517.44575: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44842 1727204517.44588: variable 'ansible_search_path' from source: unknown 44842 1727204517.48411: variable 'ansible_managed' from source: unknown 44842 1727204517.48557: variable 'omit' from source: magic vars 44842 1727204517.48594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204517.48625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204517.48648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204517.48671: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204517.48686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204517.48720: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204517.48728: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204517.48736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204517.48828: Set connection var ansible_shell_type to sh 44842 1727204517.48845: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204517.48854: Set connection var ansible_connection to ssh 44842 1727204517.48863: Set connection var ansible_pipelining to False 44842 1727204517.48876: Set connection var ansible_timeout to 10 44842 1727204517.48888: Set connection var ansible_shell_executable to /bin/sh 44842 1727204517.48914: variable 'ansible_shell_executable' from source: unknown 44842 1727204517.48922: variable 'ansible_connection' from source: unknown 44842 1727204517.48930: variable 'ansible_module_compression' from source: unknown 44842 1727204517.48936: variable 'ansible_shell_type' from source: unknown 44842 1727204517.48943: variable 'ansible_shell_executable' from source: unknown 44842 1727204517.48949: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204517.48955: variable 'ansible_pipelining' from source: unknown 44842 1727204517.48961: variable 'ansible_timeout' from source: unknown 44842 1727204517.48971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204517.49104: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204517.49129: variable 'omit' from source: magic vars 44842 1727204517.49141: starting attempt loop 44842 1727204517.49149: running the handler 44842 1727204517.49169: _low_level_execute_command(): starting 44842 1727204517.49180: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204517.49834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204517.49850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.49867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.49886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.49925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.49936: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204517.49950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.49970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204517.49983: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204517.49994: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204517.50007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.50021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.50039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.50052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.50076: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204517.50082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.50139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.50159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204517.50161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.50226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.51773: stdout chunk (state=3): >>>/root <<< 44842 1727204517.51876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204517.51944: stderr chunk (state=3): >>><<< 44842 1727204517.51949: stdout chunk (state=3): >>><<< 44842 1727204517.52056: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204517.52059: _low_level_execute_command(): starting 44842 1727204517.52062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255 `" && echo ansible-tmp-1727204517.519711-47040-280657675520255="` echo /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255 `" ) && sleep 0' 44842 1727204517.52642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204517.52655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.52675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.52694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.52740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.52752: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204517.52768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.52786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204517.52797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204517.52809: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204517.52824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.52837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.52853: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.52866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.52877: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204517.52889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.52969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.52991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204517.53006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.53091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.54931: stdout chunk (state=3): >>>ansible-tmp-1727204517.519711-47040-280657675520255=/root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255 <<< 44842 1727204517.55044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204517.55135: stderr chunk (state=3): >>><<< 44842 1727204517.55145: stdout chunk (state=3): >>><<< 44842 1727204517.55469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204517.519711-47040-280657675520255=/root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204517.55477: variable 'ansible_module_compression' from source: unknown 44842 1727204517.55479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44842 1727204517.55482: variable 'ansible_facts' from source: unknown 44842 1727204517.55484: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/AnsiballZ_network_connections.py 44842 1727204517.55585: Sending initial data 44842 1727204517.55588: Sent initial data (167 bytes) 44842 1727204517.56607: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204517.56623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.56638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.56655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.56707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.56718: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204517.56731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.56747: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 44842 1727204517.56757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204517.56770: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204517.56782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.56801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.56820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.56831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.56842: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204517.56854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.56939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.56965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204517.56982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.57070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.58770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204517.58816: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204517.58877: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp2s4jraew /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/AnsiballZ_network_connections.py <<< 44842 1727204517.58926: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204517.60590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204517.60790: stderr chunk (state=3): >>><<< 44842 1727204517.60793: stdout chunk (state=3): >>><<< 44842 1727204517.60796: done transferring module to remote 44842 1727204517.60798: _low_level_execute_command(): starting 44842 1727204517.60800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/ /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/AnsiballZ_network_connections.py && sleep 0' 44842 1727204517.61411: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204517.61427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.61448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.61474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.61519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.61532: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204517.61547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.61575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 
1727204517.61588: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204517.61600: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204517.61612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.61625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.61643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.61655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.61672: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204517.61690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.61766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.61796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204517.61816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.61906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.63613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204517.63704: stderr chunk (state=3): >>><<< 44842 1727204517.63708: stdout chunk (state=3): >>><<< 44842 1727204517.63801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204517.63809: _low_level_execute_command(): starting 44842 1727204517.63811: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/AnsiballZ_network_connections.py && sleep 0' 44842 1727204517.64384: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204517.64398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.64411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.64427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.64473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.64485: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204517.64498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.64514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204517.64524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 44842 1727204517.64534: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204517.64545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.64558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.64577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.64588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.64598: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204517.64610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.64687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.64708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204517.64723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.64812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.93001: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44842 1727204517.94728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204517.94733: stdout chunk (state=3): >>><<< 44842 1727204517.94735: stderr chunk (state=3): >>><<< 44842 1727204517.94881: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204517.94886: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204517.94888: _low_level_execute_command(): starting 44842 1727204517.94891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204517.519711-47040-280657675520255/ > /dev/null 2>&1 && sleep 0' 44842 1727204517.95449: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204517.95467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.95484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.95503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.95545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.95561: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204517.95582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 
1727204517.95601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204517.95614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204517.95625: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204517.95637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204517.95653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204517.95672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204517.95686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204517.95697: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204517.95711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204517.95793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204517.95811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204517.95826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204517.95915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204517.97731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204517.97785: stderr chunk (state=3): >>><<< 44842 1727204517.97787: stdout chunk (state=3): >>><<< 44842 1727204517.97820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204517.97823: handler run complete 44842 1727204517.97826: attempt loop complete, returning result 44842 1727204517.97828: _execute() done 44842 1727204517.97830: dumping result to json 44842 1727204517.97836: done dumping result, returning 44842 1727204517.97845: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-aad0-d242-000000000081] 44842 1727204517.97851: sending task result for task 0affcd87-79f5-aad0-d242-000000000081 44842 1727204517.97947: done sending task result for task 0affcd87-79f5-aad0-d242-000000000081 44842 1727204517.97950: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44842 1727204517.98039: no more pending results, returning what we have 44842 1727204517.98042: results queue empty 44842 1727204517.98043: checking for any_errors_fatal 44842 1727204517.98049: done 
checking for any_errors_fatal 44842 1727204517.98050: checking for max_fail_percentage 44842 1727204517.98052: done checking for max_fail_percentage 44842 1727204517.98052: checking to see if all hosts have failed and the running result is not ok 44842 1727204517.98053: done checking to see if all hosts have failed 44842 1727204517.98054: getting the remaining hosts for this loop 44842 1727204517.98056: done getting the remaining hosts for this loop 44842 1727204517.98063: getting the next task for host managed-node1 44842 1727204517.98068: done getting next task for host managed-node1 44842 1727204517.98074: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44842 1727204517.98076: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204517.98086: getting variables 44842 1727204517.98087: in VariableManager get_vars() 44842 1727204517.98124: Calling all_inventory to load vars for managed-node1 44842 1727204517.98127: Calling groups_inventory to load vars for managed-node1 44842 1727204517.98129: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204517.98137: Calling all_plugins_play to load vars for managed-node1 44842 1727204517.98140: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204517.98142: Calling groups_plugins_play to load vars for managed-node1 44842 1727204517.99381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204518.00683: done with get_vars() 44842 1727204518.00703: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.601) 0:00:28.175 ***** 44842 1727204518.00766: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44842 1727204518.00998: worker is 1 (out of 1 available) 44842 1727204518.01010: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44842 1727204518.01023: done queuing things up, now waiting for results queue to drain 44842 1727204518.01024: waiting for pending results... 
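The task above illustrates the module-run lifecycle that every `_low_level_execute_command()` sequence in this log follows: sftp-put the AnsiballZ payload into a per-task tmp directory, `chmod u+x` it, execute it with the remote Python and parse the JSON result from stdout, then `rm -rf` the tmp directory. A minimal local sketch of that lifecycle (the module body and paths here are placeholders, not the real AnsiballZ wrapper, and everything runs on the local host rather than over SSH):

```python
import json
import os
import shutil
import subprocess
import sys
import tempfile

# Stand-in for the AnsiballZ payload; the real file is a self-extracting
# zip wrapper around the collection's module code.
MODULE_BODY = 'import json; print(json.dumps({"changed": True}))\n'

tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")           # per-task tmp dir
module_path = os.path.join(tmpdir, "AnsiballZ_module.py")

with open(module_path, "w") as f:                          # "transferring module to remote"
    f.write(MODULE_BODY)

os.chmod(module_path, 0o700)                               # the `chmod u+x ... && sleep 0` stage

# Execute stage: run the module and read its JSON result from stdout,
# as _low_level_execute_command() does with `/usr/bin/python3.9 ...`.
proc = subprocess.run([sys.executable, module_path],
                      capture_output=True, text=True, check=True)
result = json.loads(proc.stdout)

shutil.rmtree(tmpdir, ignore_errors=True)                  # the `rm -f -r ...` cleanup stage
print(result)
```

This mirrors the four shell commands visible in the transcript; in a real run each stage goes through the multiplexed SSH connection, which is why every stage re-prints the `Reading configuration data` / `mux_client_request_session` debug chunks.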
44842 1727204518.01215: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 44842 1727204518.01299: in run() - task 0affcd87-79f5-aad0-d242-000000000082 44842 1727204518.01310: variable 'ansible_search_path' from source: unknown 44842 1727204518.01313: variable 'ansible_search_path' from source: unknown 44842 1727204518.01346: calling self._execute() 44842 1727204518.01429: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.01433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.01444: variable 'omit' from source: magic vars 44842 1727204518.01740: variable 'ansible_distribution_major_version' from source: facts 44842 1727204518.01750: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204518.01842: variable 'network_state' from source: role '' defaults 44842 1727204518.01851: Evaluated conditional (network_state != {}): False 44842 1727204518.01854: when evaluation is False, skipping this task 44842 1727204518.01858: _execute() done 44842 1727204518.01865: dumping result to json 44842 1727204518.01868: done dumping result, returning 44842 1727204518.01870: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-aad0-d242-000000000082] 44842 1727204518.01880: sending task result for task 0affcd87-79f5-aad0-d242-000000000082 44842 1727204518.01968: done sending task result for task 0affcd87-79f5-aad0-d242-000000000082 44842 1727204518.01971: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204518.02026: no more pending results, returning what we have 44842 1727204518.02047: results queue empty 44842 1727204518.02049: checking for any_errors_fatal 44842 1727204518.02063: done checking for any_errors_fatal 
44842 1727204518.02066: checking for max_fail_percentage 44842 1727204518.02068: done checking for max_fail_percentage 44842 1727204518.02069: checking to see if all hosts have failed and the running result is not ok 44842 1727204518.02070: done checking to see if all hosts have failed 44842 1727204518.02071: getting the remaining hosts for this loop 44842 1727204518.02073: done getting the remaining hosts for this loop 44842 1727204518.02077: getting the next task for host managed-node1 44842 1727204518.02083: done getting next task for host managed-node1 44842 1727204518.02086: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44842 1727204518.02088: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204518.02110: getting variables 44842 1727204518.02111: in VariableManager get_vars() 44842 1727204518.02178: Calling all_inventory to load vars for managed-node1 44842 1727204518.02180: Calling groups_inventory to load vars for managed-node1 44842 1727204518.02183: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204518.02192: Calling all_plugins_play to load vars for managed-node1 44842 1727204518.02194: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204518.02197: Calling groups_plugins_play to load vars for managed-node1 44842 1727204518.03788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204518.05889: done with get_vars() 44842 1727204518.05924: done getting variables 44842 1727204518.06001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.052) 0:00:28.228 ***** 44842 1727204518.06053: entering _queue_task() for managed-node1/debug 44842 1727204518.06426: worker is 1 (out of 1 available) 44842 1727204518.06439: exiting _queue_task() for managed-node1/debug 44842 1727204518.06459: done queuing things up, now waiting for results queue to drain 44842 1727204518.06463: waiting for pending results... 
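The "Configure networking state" task above is skipped because `network_state` is still the role's empty-dict default, so the task's conditional (logged as `Evaluated conditional (network_state != {}): False`) rules it out before any module is queued. The evaluation reduces to a plain comparison, sketched here outside of Jinja2:

```python
# Role default: the play requested no declarative network state.
network_state = {}

# The task's `when:` expression is evaluated against the variables;
# with the default value the comparison is False and the task is skipped.
run_task = network_state != {}
print(run_task)  # -> False
```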
44842 1727204518.06757: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44842 1727204518.06889: in run() - task 0affcd87-79f5-aad0-d242-000000000083 44842 1727204518.06922: variable 'ansible_search_path' from source: unknown 44842 1727204518.06931: variable 'ansible_search_path' from source: unknown 44842 1727204518.06976: calling self._execute() 44842 1727204518.07101: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.07129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.07144: variable 'omit' from source: magic vars 44842 1727204518.07587: variable 'ansible_distribution_major_version' from source: facts 44842 1727204518.07606: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204518.07618: variable 'omit' from source: magic vars 44842 1727204518.07675: variable 'omit' from source: magic vars 44842 1727204518.07715: variable 'omit' from source: magic vars 44842 1727204518.07766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204518.07820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204518.07848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204518.07877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204518.07904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204518.07941: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204518.07951: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.07962: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204518.08081: Set connection var ansible_shell_type to sh 44842 1727204518.08105: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204518.08120: Set connection var ansible_connection to ssh 44842 1727204518.08129: Set connection var ansible_pipelining to False 44842 1727204518.08138: Set connection var ansible_timeout to 10 44842 1727204518.08147: Set connection var ansible_shell_executable to /bin/sh 44842 1727204518.08181: variable 'ansible_shell_executable' from source: unknown 44842 1727204518.08189: variable 'ansible_connection' from source: unknown 44842 1727204518.08196: variable 'ansible_module_compression' from source: unknown 44842 1727204518.08202: variable 'ansible_shell_type' from source: unknown 44842 1727204518.08217: variable 'ansible_shell_executable' from source: unknown 44842 1727204518.08227: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.08235: variable 'ansible_pipelining' from source: unknown 44842 1727204518.08242: variable 'ansible_timeout' from source: unknown 44842 1727204518.08250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.08411: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204518.08436: variable 'omit' from source: magic vars 44842 1727204518.08448: starting attempt loop 44842 1727204518.08454: running the handler 44842 1727204518.08607: variable '__network_connections_result' from source: set_fact 44842 1727204518.08679: handler run complete 44842 1727204518.08701: attempt loop complete, returning result 44842 1727204518.08709: _execute() done 44842 1727204518.08716: dumping result to json 44842 1727204518.08723: 
done dumping result, returning 44842 1727204518.08734: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-aad0-d242-000000000083] 44842 1727204518.08743: sending task result for task 0affcd87-79f5-aad0-d242-000000000083 44842 1727204518.08872: done sending task result for task 0affcd87-79f5-aad0-d242-000000000083 ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 44842 1727204518.08948: no more pending results, returning what we have 44842 1727204518.08951: results queue empty 44842 1727204518.08953: checking for any_errors_fatal 44842 1727204518.08962: done checking for any_errors_fatal 44842 1727204518.08963: checking for max_fail_percentage 44842 1727204518.08967: done checking for max_fail_percentage 44842 1727204518.08968: checking to see if all hosts have failed and the running result is not ok 44842 1727204518.08969: done checking to see if all hosts have failed 44842 1727204518.08970: getting the remaining hosts for this loop 44842 1727204518.08973: done getting the remaining hosts for this loop 44842 1727204518.08978: getting the next task for host managed-node1 44842 1727204518.08985: done getting next task for host managed-node1 44842 1727204518.08990: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44842 1727204518.08992: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204518.09003: getting variables 44842 1727204518.09005: in VariableManager get_vars() 44842 1727204518.09045: Calling all_inventory to load vars for managed-node1 44842 1727204518.09048: Calling groups_inventory to load vars for managed-node1 44842 1727204518.09051: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204518.09066: Calling all_plugins_play to load vars for managed-node1 44842 1727204518.09069: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204518.09072: Calling groups_plugins_play to load vars for managed-node1 44842 1727204518.10152: WORKER PROCESS EXITING 44842 1727204518.11069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204518.14234: done with get_vars() 44842 1727204518.14275: done getting variables 44842 1727204518.14462: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.084) 0:00:28.313 ***** 44842 1727204518.14498: entering _queue_task() for managed-node1/debug 44842 1727204518.15306: worker is 1 (out of 1 available) 44842 1727204518.15319: exiting _queue_task() for managed-node1/debug 44842 1727204518.15331: done queuing things up, now waiting for results queue to drain 44842 1727204518.15332: waiting for pending results... 
44842 1727204518.15953: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44842 1727204518.16053: in run() - task 0affcd87-79f5-aad0-d242-000000000084 44842 1727204518.16069: variable 'ansible_search_path' from source: unknown 44842 1727204518.16073: variable 'ansible_search_path' from source: unknown 44842 1727204518.16108: calling self._execute() 44842 1727204518.16211: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.16216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.16226: variable 'omit' from source: magic vars 44842 1727204518.17295: variable 'ansible_distribution_major_version' from source: facts 44842 1727204518.17308: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204518.17315: variable 'omit' from source: magic vars 44842 1727204518.17357: variable 'omit' from source: magic vars 44842 1727204518.17392: variable 'omit' from source: magic vars 44842 1727204518.17436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204518.17875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204518.17896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204518.17913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204518.17925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204518.17962: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204518.17968: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.17971: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204518.18273: Set connection var ansible_shell_type to sh 44842 1727204518.18286: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204518.18289: Set connection var ansible_connection to ssh 44842 1727204518.18296: Set connection var ansible_pipelining to False 44842 1727204518.18302: Set connection var ansible_timeout to 10 44842 1727204518.18309: Set connection var ansible_shell_executable to /bin/sh 44842 1727204518.18334: variable 'ansible_shell_executable' from source: unknown 44842 1727204518.18337: variable 'ansible_connection' from source: unknown 44842 1727204518.18340: variable 'ansible_module_compression' from source: unknown 44842 1727204518.18343: variable 'ansible_shell_type' from source: unknown 44842 1727204518.18345: variable 'ansible_shell_executable' from source: unknown 44842 1727204518.18347: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.18349: variable 'ansible_pipelining' from source: unknown 44842 1727204518.18353: variable 'ansible_timeout' from source: unknown 44842 1727204518.18355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.18708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204518.18721: variable 'omit' from source: magic vars 44842 1727204518.18727: starting attempt loop 44842 1727204518.18730: running the handler 44842 1727204518.18782: variable '__network_connections_result' from source: set_fact 44842 1727204518.18868: variable '__network_connections_result' from source: set_fact 44842 1727204518.19173: handler run complete 44842 1727204518.19199: attempt loop complete, returning result 44842 1727204518.19203: 
_execute() done 44842 1727204518.19205: dumping result to json 44842 1727204518.19208: done dumping result, returning 44842 1727204518.19217: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-aad0-d242-000000000084] 44842 1727204518.19222: sending task result for task 0affcd87-79f5-aad0-d242-000000000084 44842 1727204518.19332: done sending task result for task 0affcd87-79f5-aad0-d242-000000000084 44842 1727204518.19337: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "ethtest0",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
44842 1727204518.19432: no more pending results, returning what we have 44842 1727204518.19435: results queue empty 44842 1727204518.19437: checking for any_errors_fatal 44842 1727204518.19443: done checking for any_errors_fatal 44842 1727204518.19444: checking for max_fail_percentage 44842 1727204518.19445: done checking for max_fail_percentage 44842 1727204518.19446: checking to see if all hosts have failed and the running result is not ok 44842 1727204518.19449: done checking to see if all hosts have failed 44842 1727204518.19449: getting the remaining hosts for this loop 44842 1727204518.19452: done getting the remaining hosts for this loop 44842 1727204518.19456: getting the next task for host managed-node1 44842 1727204518.19465: done getting next task for host managed-node1 44842 1727204518.19470: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44842 1727204518.19472: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0,
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204518.19481: getting variables 44842 1727204518.19483: in VariableManager get_vars() 44842 1727204518.19518: Calling all_inventory to load vars for managed-node1 44842 1727204518.19520: Calling groups_inventory to load vars for managed-node1 44842 1727204518.19522: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204518.19532: Calling all_plugins_play to load vars for managed-node1 44842 1727204518.19534: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204518.19537: Calling groups_plugins_play to load vars for managed-node1 44842 1727204518.23903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204518.27891: done with get_vars() 44842 1727204518.28041: done getting variables 44842 1727204518.28106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.137)       0:00:28.450 *****
44842 1727204518.28258: entering _queue_task() for managed-node1/debug 44842 1727204518.28938: worker is 1 (out of 1 available) 44842 1727204518.28951: exiting _queue_task() for managed-node1/debug 44842 1727204518.28969: done queuing things up, now waiting for results queue to drain 44842 1727204518.28970: waiting for pending results... 
44842 1727204518.29850: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44842 1727204518.30219: in run() - task 0affcd87-79f5-aad0-d242-000000000085 44842 1727204518.30239: variable 'ansible_search_path' from source: unknown 44842 1727204518.30246: variable 'ansible_search_path' from source: unknown 44842 1727204518.30292: calling self._execute() 44842 1727204518.30408: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.30512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.30530: variable 'omit' from source: magic vars 44842 1727204518.30968: variable 'ansible_distribution_major_version' from source: facts 44842 1727204518.30993: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204518.31123: variable 'network_state' from source: role '' defaults 44842 1727204518.31139: Evaluated conditional (network_state != {}): False 44842 1727204518.31145: when evaluation is False, skipping this task 44842 1727204518.31151: _execute() done 44842 1727204518.31157: dumping result to json 44842 1727204518.31168: done dumping result, returning 44842 1727204518.31178: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-aad0-d242-000000000085] 44842 1727204518.31195: sending task result for task 0affcd87-79f5-aad0-d242-000000000085
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
44842 1727204518.31359: no more pending results, returning what we have 44842 1727204518.31368: results queue empty 44842 1727204518.31369: checking for any_errors_fatal 44842 1727204518.31383: done checking for any_errors_fatal 44842 1727204518.31383: checking for max_fail_percentage 44842 1727204518.31385: done checking for max_fail_percentage 44842 1727204518.31386: checking to see if all hosts have 
failed and the running result is not ok 44842 1727204518.31388: done checking to see if all hosts have failed 44842 1727204518.31388: getting the remaining hosts for this loop 44842 1727204518.31390: done getting the remaining hosts for this loop 44842 1727204518.31395: getting the next task for host managed-node1 44842 1727204518.31402: done getting next task for host managed-node1 44842 1727204518.31406: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44842 1727204518.31409: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204518.31425: getting variables 44842 1727204518.31427: in VariableManager get_vars() 44842 1727204518.31476: Calling all_inventory to load vars for managed-node1 44842 1727204518.31479: Calling groups_inventory to load vars for managed-node1 44842 1727204518.31482: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204518.31494: Calling all_plugins_play to load vars for managed-node1 44842 1727204518.31497: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204518.31501: Calling groups_plugins_play to load vars for managed-node1 44842 1727204518.32566: done sending task result for task 0affcd87-79f5-aad0-d242-000000000085 44842 1727204518.32571: WORKER PROCESS EXITING 44842 1727204518.33581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204518.35686: done with get_vars() 44842 1727204518.35716: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 15:01:58 -0400 
(0:00:00.075) 0:00:28.526 ***** 44842 1727204518.35825: entering _queue_task() for managed-node1/ping 44842 1727204518.36417: worker is 1 (out of 1 available) 44842 1727204518.36430: exiting _queue_task() for managed-node1/ping 44842 1727204518.36443: done queuing things up, now waiting for results queue to drain 44842 1727204518.36445: waiting for pending results... 44842 1727204518.37315: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 44842 1727204518.37409: in run() - task 0affcd87-79f5-aad0-d242-000000000086 44842 1727204518.37421: variable 'ansible_search_path' from source: unknown 44842 1727204518.37424: variable 'ansible_search_path' from source: unknown 44842 1727204518.37463: calling self._execute() 44842 1727204518.37564: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.37568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.37578: variable 'omit' from source: magic vars 44842 1727204518.37938: variable 'ansible_distribution_major_version' from source: facts 44842 1727204518.37952: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204518.37958: variable 'omit' from source: magic vars 44842 1727204518.38001: variable 'omit' from source: magic vars 44842 1727204518.38034: variable 'omit' from source: magic vars 44842 1727204518.38077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204518.38114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204518.38134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204518.38151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204518.38164: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204518.38192: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204518.38195: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.38198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.38289: Set connection var ansible_shell_type to sh 44842 1727204518.38299: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204518.38304: Set connection var ansible_connection to ssh 44842 1727204518.38310: Set connection var ansible_pipelining to False 44842 1727204518.38315: Set connection var ansible_timeout to 10 44842 1727204518.38322: Set connection var ansible_shell_executable to /bin/sh 44842 1727204518.38343: variable 'ansible_shell_executable' from source: unknown 44842 1727204518.38346: variable 'ansible_connection' from source: unknown 44842 1727204518.38349: variable 'ansible_module_compression' from source: unknown 44842 1727204518.38353: variable 'ansible_shell_type' from source: unknown 44842 1727204518.38355: variable 'ansible_shell_executable' from source: unknown 44842 1727204518.38358: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204518.38362: variable 'ansible_pipelining' from source: unknown 44842 1727204518.38366: variable 'ansible_timeout' from source: unknown 44842 1727204518.38368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204518.38565: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204518.38574: variable 'omit' from source: magic vars 44842 1727204518.38579: starting attempt loop 44842 1727204518.38582: running 
the handler 44842 1727204518.38597: _low_level_execute_command(): starting 44842 1727204518.38605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204518.39504: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204518.39517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204518.39527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.39543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.39585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204518.39592: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204518.39603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.39617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204518.39626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204518.39635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204518.39639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204518.39651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.39667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.39707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204518.39716: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204518.39727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.39799: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204518.40332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204518.40345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204518.40431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204518.42015: stdout chunk (state=3): >>>/root <<< 44842 1727204518.42187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204518.42191: stdout chunk (state=3): >>><<< 44842 1727204518.42201: stderr chunk (state=3): >>><<< 44842 1727204518.42222: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204518.42236: _low_level_execute_command(): starting 44842 1727204518.42243: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294 `" && echo ansible-tmp-1727204518.4222322-47078-278105584852294="` echo /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294 `" ) && sleep 0' 44842 1727204518.42879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204518.42882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204518.42885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.42888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.42991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204518.42994: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204518.42996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.42999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204518.43001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204518.43003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204518.43005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204518.43007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.43009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.43011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204518.43013: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204518.43014: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.43105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204518.43108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204518.43110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204518.43348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204518.45198: stdout chunk (state=3): >>>ansible-tmp-1727204518.4222322-47078-278105584852294=/root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294 <<< 44842 1727204518.45367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204518.45373: stdout chunk (state=3): >>><<< 44842 1727204518.45380: stderr chunk (state=3): >>><<< 44842 1727204518.45400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204518.4222322-47078-278105584852294=/root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204518.45447: variable 'ansible_module_compression' from source: unknown 44842 1727204518.45493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44842 1727204518.45530: variable 'ansible_facts' from source: unknown 44842 1727204518.45604: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/AnsiballZ_ping.py 44842 1727204518.45741: Sending initial data 44842 1727204518.45745: Sent initial data (153 bytes) 44842 1727204518.46607: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.46611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.46628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204518.46633: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.46643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204518.46649: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.46671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 
1727204518.46680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.46720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204518.46730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204518.46800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204518.48923: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204518.48932: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204518.48936: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpzu9ed0da /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/AnsiballZ_ping.py <<< 44842 1727204518.48939: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204518.49880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204518.50050: stderr chunk (state=3): >>><<< 44842 1727204518.50054: stdout chunk (state=3): >>><<< 44842 1727204518.50056: done transferring module to remote 44842 1727204518.50058: _low_level_execute_command(): starting 44842 1727204518.50061: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/ /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/AnsiballZ_ping.py && sleep 0' 44842 1727204518.50597: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204518.50605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.50630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204518.50672: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.50757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204518.50775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204518.50875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204518.51078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204518.52780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204518.52827: stderr chunk (state=3): >>><<< 44842 1727204518.52830: stdout chunk (state=3): >>><<< 44842 1727204518.52849: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204518.52852: _low_level_execute_command(): starting 44842 1727204518.52856: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/AnsiballZ_ping.py && sleep 0' 44842 1727204518.53319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204518.53323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204518.53355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 
1727204518.53358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204518.53360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204518.53419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204518.53423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204518.53490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204518.66566: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44842 1727204518.67431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204518.67494: stderr chunk (state=3): >>><<< 44842 1727204518.67498: stdout chunk (state=3): >>><<< 44842 1727204518.67518: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204518.67540: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
44842 1727204518.67548: _low_level_execute_command(): starting
44842 1727204518.67552: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204518.4222322-47078-278105584852294/ > /dev/null 2>&1 && sleep 0'
44842 1727204518.68036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.68040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204518.68082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
44842 1727204518.68086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.68088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.68139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204518.68143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204518.68148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204518.68202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204518.69935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204518.69989: stderr chunk (state=3): >>><<<
44842 1727204518.69992: stdout chunk (state=3): >>><<<
44842 1727204518.70005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204518.70010: handler run complete
44842 1727204518.70025: attempt loop complete, returning result
44842 1727204518.70028: _execute() done
44842 1727204518.70031: dumping result to json
44842 1727204518.70033: done dumping result, returning
44842 1727204518.70043: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-aad0-d242-000000000086]
44842 1727204518.70053: sending task result for task 0affcd87-79f5-aad0-d242-000000000086
44842 1727204518.70142: done sending task result for task 0affcd87-79f5-aad0-d242-000000000086
44842 1727204518.70145: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "ping": "pong"
}
44842 1727204518.70204: no more pending results, returning what we have
44842 1727204518.70207: results queue empty
44842 1727204518.70208: checking for any_errors_fatal
44842 1727204518.70216: done checking for any_errors_fatal
44842 1727204518.70217: checking for max_fail_percentage
44842 1727204518.70219: done checking for max_fail_percentage
44842 1727204518.70219: checking to see if all hosts have failed and the running result is not ok
44842 1727204518.70220: done checking to see if all hosts have failed
44842 1727204518.70221: getting the remaining hosts for this loop
44842 1727204518.70223: done getting the remaining hosts for this loop
44842 1727204518.70226: getting the next task for host managed-node1
44842 1727204518.70235: done getting next task for host managed-node1
44842 1727204518.70237: ^ task is: TASK: meta (role_complete)
44842 1727204518.70239: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204518.70249: getting variables
44842 1727204518.70251: in VariableManager get_vars()
44842 1727204518.70293: Calling all_inventory to load vars for managed-node1
44842 1727204518.70296: Calling groups_inventory to load vars for managed-node1
44842 1727204518.70298: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204518.70307: Calling all_plugins_play to load vars for managed-node1
44842 1727204518.70309: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204518.70311: Calling groups_plugins_play to load vars for managed-node1
44842 1727204518.71304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204518.72978: done with get_vars()
44842 1727204518.73004: done getting variables
44842 1727204518.73089: done queuing things up, now waiting for results queue to drain
44842 1727204518.73091: results queue empty
44842 1727204518.73092: checking for any_errors_fatal
44842 1727204518.73095: done checking for any_errors_fatal
44842 1727204518.73096: checking for max_fail_percentage
44842 1727204518.73097: done checking for max_fail_percentage
44842 1727204518.73097: checking to see if all hosts have failed and the running result is not ok
44842 1727204518.73098: done checking to see if all hosts have failed
44842 1727204518.73099: getting the remaining hosts for this loop
44842 1727204518.73100: done getting the remaining hosts for this loop
44842 1727204518.73103: getting the next task for host managed-node1
44842 1727204518.73106: done getting next task for host managed-node1
44842 1727204518.73107: ^ task is: TASK: meta (flush_handlers)
44842 1727204518.73109: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204518.73111: getting variables
44842 1727204518.73112: in VariableManager get_vars()
44842 1727204518.73124: Calling all_inventory to load vars for managed-node1
44842 1727204518.73126: Calling groups_inventory to load vars for managed-node1
44842 1727204518.73129: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204518.73134: Calling all_plugins_play to load vars for managed-node1
44842 1727204518.73136: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204518.73139: Calling groups_plugins_play to load vars for managed-node1
44842 1727204518.74422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204518.76163: done with get_vars()
44842 1727204518.76190: done getting variables
44842 1727204518.76242: in VariableManager get_vars()
44842 1727204518.76256: Calling all_inventory to load vars for managed-node1
44842 1727204518.76259: Calling groups_inventory to load vars for managed-node1
44842 1727204518.76265: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204518.76271: Calling all_plugins_play to load vars for managed-node1
44842 1727204518.76273: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204518.76276: Calling groups_plugins_play to load vars for managed-node1
44842 1727204518.77636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204518.79322: done with get_vars()
44842 1727204518.79351: done queuing things up, now waiting for results queue to drain
44842 1727204518.79353: results queue empty
44842 1727204518.79354: checking for any_errors_fatal
44842 1727204518.79355: done checking for any_errors_fatal
44842 1727204518.79356: checking for max_fail_percentage
44842 1727204518.79357: done checking for max_fail_percentage
44842 1727204518.79358: checking to see if all hosts have failed and the running result is not ok
44842 1727204518.79359: done checking to see if all hosts have failed
44842 1727204518.79359: getting the remaining hosts for this loop
44842 1727204518.79362: done getting the remaining hosts for this loop
44842 1727204518.79366: getting the next task for host managed-node1
44842 1727204518.79370: done getting next task for host managed-node1
44842 1727204518.79372: ^ task is: TASK: meta (flush_handlers)
44842 1727204518.79378: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204518.79380: getting variables
44842 1727204518.79381: in VariableManager get_vars()
44842 1727204518.79393: Calling all_inventory to load vars for managed-node1
44842 1727204518.79395: Calling groups_inventory to load vars for managed-node1
44842 1727204518.79397: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204518.79401: Calling all_plugins_play to load vars for managed-node1
44842 1727204518.79403: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204518.79406: Calling groups_plugins_play to load vars for managed-node1
44842 1727204518.80683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204518.82484: done with get_vars()
44842 1727204518.82505: done getting variables
44842 1727204518.82549: in VariableManager get_vars()
44842 1727204518.82566: Calling all_inventory to load vars for managed-node1
44842 1727204518.82568: Calling groups_inventory to load vars for managed-node1
44842 1727204518.82570: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204518.82575: Calling all_plugins_play to load vars for managed-node1
44842 1727204518.82576: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204518.82579: Calling groups_plugins_play to load vars for managed-node1
44842 1727204518.83827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204518.85551: done with get_vars()
44842 1727204518.85585: done queuing things up, now waiting for results queue to drain
44842 1727204518.85587: results queue empty
44842 1727204518.85588: checking for any_errors_fatal
44842 1727204518.85589: done checking for any_errors_fatal
44842 1727204518.85590: checking for max_fail_percentage
44842 1727204518.85591: done checking for max_fail_percentage
44842 1727204518.85591: checking to see if all hosts have failed and the running result is not ok
44842 1727204518.85592: done checking to see if all hosts have failed
44842 1727204518.85593: getting the remaining hosts for this loop
44842 1727204518.85594: done getting the remaining hosts for this loop
44842 1727204518.85596: getting the next task for host managed-node1
44842 1727204518.85599: done getting next task for host managed-node1
44842 1727204518.85600: ^ task is: None
44842 1727204518.85602: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204518.85602: done queuing things up, now waiting for results queue to drain
44842 1727204518.85603: results queue empty
44842 1727204518.85604: checking for any_errors_fatal
44842 1727204518.85605: done checking for any_errors_fatal
44842 1727204518.85605: checking for max_fail_percentage
44842 1727204518.85606: done checking for max_fail_percentage
44842 1727204518.85607: checking to see if all hosts have failed and the running result is not ok
44842 1727204518.85607: done checking to see if all hosts have failed
44842 1727204518.85609: getting the next task for host managed-node1
44842 1727204518.85611: done getting next task for host managed-node1
44842 1727204518.85611: ^ task is: None
44842 1727204518.85613: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204518.85673: in VariableManager get_vars()
44842 1727204518.85691: done with get_vars()
44842 1727204518.85696: in VariableManager get_vars()
44842 1727204518.85705: done with get_vars()
44842 1727204518.85709: variable 'omit' from source: magic vars
44842 1727204518.85738: in VariableManager get_vars()
44842 1727204518.85748: done with get_vars()
44842 1727204518.85772: variable 'omit' from source: magic vars

PLAY [Delete the interface] ****************************************************
44842 1727204518.85946: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
44842 1727204518.85975: getting the remaining hosts for this loop
44842 1727204518.85977: done getting the remaining hosts for this loop
44842 1727204518.85979: getting the next task for host managed-node1
44842 1727204518.85982: done getting next task for host managed-node1
44842 1727204518.85984: ^ task is: TASK: Gathering Facts
44842 1727204518.85985: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204518.85987: getting variables
44842 1727204518.85988: in VariableManager get_vars()
44842 1727204518.85996: Calling all_inventory to load vars for managed-node1
44842 1727204518.85998: Calling groups_inventory to load vars for managed-node1
44842 1727204518.86001: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204518.86006: Calling all_plugins_play to load vars for managed-node1
44842 1727204518.86008: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204518.86011: Calling groups_plugins_play to load vars for managed-node1
44842 1727204518.87422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204518.89147: done with get_vars()
44842 1727204518.89176: done getting variables
44842 1727204518.89218: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Tuesday 24 September 2024  15:01:58 -0400 (0:00:00.534)       0:00:29.060 *****
44842 1727204518.89248: entering _queue_task() for managed-node1/gather_facts
44842 1727204518.89598: worker is 1 (out of 1 available)
44842 1727204518.89610: exiting _queue_task() for managed-node1/gather_facts
44842 1727204518.89622: done queuing things up, now waiting for results queue to drain
44842 1727204518.89624: waiting for pending results...
44842 1727204518.89923: running TaskExecutor() for managed-node1/TASK: Gathering Facts
44842 1727204518.90051: in run() - task 0affcd87-79f5-aad0-d242-00000000057e
44842 1727204518.90080: variable 'ansible_search_path' from source: unknown
44842 1727204518.90125: calling self._execute()
44842 1727204518.90235: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204518.90248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204518.90268: variable 'omit' from source: magic vars
44842 1727204518.90681: variable 'ansible_distribution_major_version' from source: facts
44842 1727204518.90701: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204518.90714: variable 'omit' from source: magic vars
44842 1727204518.90750: variable 'omit' from source: magic vars
44842 1727204518.90795: variable 'omit' from source: magic vars
44842 1727204518.90845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204518.90889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204518.90916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204518.90943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204518.90959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204518.90999: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204518.91009: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204518.91017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204518.91125: Set connection var ansible_shell_type to sh
44842 1727204518.91142: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204518.91157: Set connection var ansible_connection to ssh
44842 1727204518.91175: Set connection var ansible_pipelining to False
44842 1727204518.91186: Set connection var ansible_timeout to 10
44842 1727204518.91198: Set connection var ansible_shell_executable to /bin/sh
44842 1727204518.91225: variable 'ansible_shell_executable' from source: unknown
44842 1727204518.91234: variable 'ansible_connection' from source: unknown
44842 1727204518.91242: variable 'ansible_module_compression' from source: unknown
44842 1727204518.91249: variable 'ansible_shell_type' from source: unknown
44842 1727204518.91256: variable 'ansible_shell_executable' from source: unknown
44842 1727204518.91272: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204518.91281: variable 'ansible_pipelining' from source: unknown
44842 1727204518.91288: variable 'ansible_timeout' from source: unknown
44842 1727204518.91296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204518.91488: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
44842 1727204518.91508: variable 'omit' from source: magic vars
44842 1727204518.91518: starting attempt loop
44842 1727204518.91525: running the handler
44842 1727204518.91545: variable 'ansible_facts' from source: unknown
44842 1727204518.91576: _low_level_execute_command(): starting
44842 1727204518.91592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
44842 1727204518.92376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204518.92392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204518.92407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.92427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204518.92477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204518.92492: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204518.92505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.92522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204518.92534: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204518.92545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204518.92556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204518.92575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.92593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204518.92605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204518.92616: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204518.92629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.92706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204518.92722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204518.92735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204518.92839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204518.94469: stdout chunk (state=3): >>>/root <<<
44842 1727204518.94576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204518.94652: stderr chunk (state=3): >>><<<
44842 1727204518.94669: stdout chunk (state=3): >>><<<
44842 1727204518.94798: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204518.94801: _low_level_execute_command(): starting
44842 1727204518.94805: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276 `" && echo ansible-tmp-1727204518.9470282-47100-113646070113276="` echo /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276 `" ) && sleep 0'
44842 1727204518.95482: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204518.95486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204518.95524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.95528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.95531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.95602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204518.95616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204518.95705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204518.97548: stdout chunk (state=3): >>>ansible-tmp-1727204518.9470282-47100-113646070113276=/root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276 <<<
44842 1727204518.97669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204518.97754: stderr chunk (state=3): >>><<<
44842 1727204518.97771: stdout chunk (state=3): >>><<<
44842 1727204518.97970: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204518.9470282-47100-113646070113276=/root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204518.97974: variable 'ansible_module_compression' from source: unknown
44842 1727204518.97976: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
44842 1727204518.98169: variable 'ansible_facts' from source: unknown
44842 1727204518.98173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/AnsiballZ_setup.py
44842 1727204518.98321: Sending initial data
44842 1727204518.98324: Sent initial data (154 bytes)
44842 1727204518.99305: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204518.99319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204518.99335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.99354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204518.99402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204518.99415: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204518.99429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.99446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204518.99458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204518.99476: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204518.99488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204518.99502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204518.99518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204518.99530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204518.99541: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204518.99554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204518.99633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204518.99657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204518.99683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204518.99778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204519.01502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
44842 1727204519.01548: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
44842 1727204519.01604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpopa3ku78 /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/AnsiballZ_setup.py <<<
44842 1727204519.01655: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
44842 1727204519.04483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204519.04716: stderr chunk (state=3): >>><<<
44842 1727204519.04719: stdout chunk (state=3): >>><<<
44842 1727204519.04721: done transferring module to remote
44842 1727204519.04723: _low_level_execute_command(): starting
44842 1727204519.04725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/ /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/AnsiballZ_setup.py && sleep 0'
44842 1727204519.06029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204519.06682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204519.06698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204519.06715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204519.06768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204519.06781: stderr chunk (state=3): >>>debug2: match not found <<<
44842 1727204519.06794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204519.06811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
44842 1727204519.06822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
44842 1727204519.06833: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
44842 1727204519.06845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204519.06857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204519.06878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
44842 1727204519.06890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
44842 1727204519.06900: stderr chunk (state=3): >>>debug2: match found <<<
44842 1727204519.06912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
44842 1727204519.06992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
44842 1727204519.07009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
44842 1727204519.07023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
44842 1727204519.07106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
44842 1727204519.08893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
44842 1727204519.08897: stdout chunk (state=3): >>><<<
44842 1727204519.08899: stderr chunk (state=3): >>><<<
44842 1727204519.09009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
44842 1727204519.09013: _low_level_execute_command(): starting
44842 1727204519.09015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/AnsiballZ_setup.py && sleep 0'
44842 1727204519.09596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
44842 1727204519.09610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
44842 1727204519.09623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
44842 1727204519.09641: stderr chunk (state=3): >>>debug1:
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204519.09689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204519.09701: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204519.09714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204519.09731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204519.09741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204519.09751: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204519.09767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204519.09781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204519.09795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204519.09806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204519.09817: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204519.09829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204519.09909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204519.09928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204519.09942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204519.10715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204519.62955: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP 
PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "59", "epoch": "1727204519", "epoch_int": "1727204519", "date": "2024-09-24", "time": "15:01:59", "iso8601_micro": "2024-09-24T19:01:59.330296Z", "iso8601": "2024-09-24T19:01:59Z", "iso8601_basic": "20240924T150159330296", "iso8601_basic_short": "20240924T150159", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "peerethtest0", "rpltstbr", "lo", "ethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:d5:21:e5:60:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": 
"fe80::40d5:21ff:fee5:60c0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fi<<< 44842 1727204519.62988: stdout chunk (state=3): >>>xed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "da:d5:74:1e:37:62", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f358:20b:dfcc:3e72", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::40d5:21ff:fee5:60c0", "fe80::108f:92ff:fee7:c1ab", "fe80::f358:20b:dfcc:3e72"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab", "fe80::40d5:21ff:fee5:60c0"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2761, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 771, "free": 2761}, "nocache": {"free": 3237, "used": 295}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 782, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271720448, "block_size": 4096, "block_total": 65519355, "block_available": 64519463, "block_used": 999892, "inode_total": 131071472, "inode_available": 130998228, "inode_used": 73244, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.37, "5m": 0.43, "15m": 0.28}, 
"ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204519.64650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204519.64653: stdout chunk (state=3): >>><<< 44842 1727204519.64656: stderr chunk (state=3): >>><<< 44842 1727204519.64874: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "59", "epoch": "1727204519", "epoch_int": "1727204519", "date": "2024-09-24", "time": "15:01:59", "iso8601_micro": "2024-09-24T19:01:59.330296Z", "iso8601": "2024-09-24T19:01:59Z", "iso8601_basic": "20240924T150159330296", "iso8601_basic_short": "20240924T150159", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", 
"ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "peerethtest0", "rpltstbr", "lo", "ethtest0"], "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:d5:21:e5:60:c0", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40d5:21ff:fee5:60c0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off 
[fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": 
"off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off 
[fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "da:d5:74:1e:37:62", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f358:20b:dfcc:3e72", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": 
"on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::40d5:21ff:fee5:60c0", "fe80::108f:92ff:fee7:c1ab", "fe80::f358:20b:dfcc:3e72"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab", "fe80::40d5:21ff:fee5:60c0"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2761, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 771, "free": 2761}, "nocache": {"free": 3237, "used": 295}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 782, "ansible_lvm": "N/A", 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271720448, "block_size": 4096, "block_total": 65519355, "block_available": 64519463, "block_used": 999892, "inode_total": 131071472, "inode_available": 130998228, "inode_used": 73244, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.37, "5m": 0.43, "15m": 0.28}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204519.65340: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204519.65372: _low_level_execute_command(): starting 44842 1727204519.65383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204518.9470282-47100-113646070113276/ > /dev/null 2>&1 && sleep 0' 44842 1727204519.66404: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204519.66407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204519.66444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204519.66448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204519.66450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204519.66521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204519.66534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204519.66622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204519.68559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204519.68568: stdout chunk (state=3): >>><<< 44842 1727204519.68571: stderr chunk (state=3): >>><<< 44842 1727204519.68873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204519.68877: handler run complete 44842 1727204519.68880: variable 'ansible_facts' from source: unknown 44842 1727204519.68911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204519.69369: variable 'ansible_facts' from source: unknown 44842 1727204519.69490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204519.69678: attempt loop complete, returning result 44842 1727204519.69689: _execute() done 44842 1727204519.69697: dumping result to json 44842 1727204519.69749: done dumping result, returning 44842 1727204519.69766: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-aad0-d242-00000000057e] 44842 1727204519.69778: sending task result for task 0affcd87-79f5-aad0-d242-00000000057e ok: [managed-node1] 44842 1727204519.70695: no more pending results, returning what we have 44842 1727204519.70698: results queue empty 44842 1727204519.70699: checking for any_errors_fatal 44842 1727204519.70700: done checking for any_errors_fatal 44842 1727204519.70701: checking for max_fail_percentage 44842 1727204519.70702: done checking for max_fail_percentage 44842 1727204519.70703: checking to see if all hosts have failed and the running result is not ok 44842 1727204519.70704: done checking to see if all hosts have failed 44842 1727204519.70705: getting the remaining hosts for this loop 44842 1727204519.70707: done getting the remaining hosts for this loop 44842 1727204519.70711: getting the next task for host managed-node1 44842 1727204519.70718: done getting next task for host managed-node1 44842 1727204519.70720: ^ task is: TASK: 
meta (flush_handlers) 44842 1727204519.70722: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204519.70727: getting variables 44842 1727204519.70729: in VariableManager get_vars() 44842 1727204519.70754: Calling all_inventory to load vars for managed-node1 44842 1727204519.70757: Calling groups_inventory to load vars for managed-node1 44842 1727204519.70763: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204519.70776: Calling all_plugins_play to load vars for managed-node1 44842 1727204519.70779: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204519.70782: Calling groups_plugins_play to load vars for managed-node1 44842 1727204519.71485: done sending task result for task 0affcd87-79f5-aad0-d242-00000000057e 44842 1727204519.71488: WORKER PROCESS EXITING 44842 1727204519.73396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204519.75908: done with get_vars() 44842 1727204519.75945: done getting variables 44842 1727204519.76019: in VariableManager get_vars() 44842 1727204519.76035: Calling all_inventory to load vars for managed-node1 44842 1727204519.76037: Calling groups_inventory to load vars for managed-node1 44842 1727204519.76041: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204519.76046: Calling all_plugins_play to load vars for managed-node1 44842 1727204519.76048: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204519.76051: Calling groups_plugins_play to load vars for managed-node1 44842 1727204519.79008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 
1727204519.82626: done with get_vars() 44842 1727204519.82666: done queuing things up, now waiting for results queue to drain 44842 1727204519.82668: results queue empty 44842 1727204519.82670: checking for any_errors_fatal 44842 1727204519.82675: done checking for any_errors_fatal 44842 1727204519.82676: checking for max_fail_percentage 44842 1727204519.82677: done checking for max_fail_percentage 44842 1727204519.82677: checking to see if all hosts have failed and the running result is not ok 44842 1727204519.82683: done checking to see if all hosts have failed 44842 1727204519.82684: getting the remaining hosts for this loop 44842 1727204519.82685: done getting the remaining hosts for this loop 44842 1727204519.82688: getting the next task for host managed-node1 44842 1727204519.82692: done getting next task for host managed-node1 44842 1727204519.82695: ^ task is: TASK: Include the task 'delete_interface.yml' 44842 1727204519.82696: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204519.82699: getting variables 44842 1727204519.82700: in VariableManager get_vars() 44842 1727204519.82710: Calling all_inventory to load vars for managed-node1 44842 1727204519.82712: Calling groups_inventory to load vars for managed-node1 44842 1727204519.82714: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204519.82720: Calling all_plugins_play to load vars for managed-node1 44842 1727204519.82722: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204519.82725: Calling groups_plugins_play to load vars for managed-node1 44842 1727204519.89642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204519.91386: done with get_vars() 44842 1727204519.91419: done getting variables

TASK [Include the task 'delete_interface.yml'] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8
Tuesday 24 September 2024 15:01:59 -0400 (0:00:01.022) 0:00:30.083 *****

44842 1727204519.91493: entering _queue_task() for managed-node1/include_tasks 44842 1727204519.92637: worker is 1 (out of 1 available) 44842 1727204519.92651: exiting _queue_task() for managed-node1/include_tasks 44842 1727204519.92663: done queuing things up, now waiting for results queue to drain 44842 1727204519.92968: waiting for pending results...
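[Editor's note] The playbook source behind the "Include the task 'delete_interface.yml'" banner (down_profile+delete_interface.yml:8) is not reproduced in this log. Judging only from the task name and the conditional the trace later reports evaluating (ansible_distribution_major_version != '6'), the step plausibly looks like the following hypothetical sketch; the relative file path and structure are assumptions, not confirmed by the log:

```yaml
# Hypothetical reconstruction -- the actual playbook source is not part of this log.
- name: Include the task 'delete_interface.yml'
  include_tasks: tasks/delete_interface.yml
  when: ansible_distribution_major_version != '6'
```

An `include_tasks` step like this is processed dynamically at run time, which is why the trace below shows the included file being loaded, filtered on tags, and spliced into the host's task list only after the conditional passes.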
44842 1727204519.93189: running TaskExecutor() for managed-node1/TASK: Include the task 'delete_interface.yml' 44842 1727204519.93491: in run() - task 0affcd87-79f5-aad0-d242-000000000089 44842 1727204519.93503: variable 'ansible_search_path' from source: unknown 44842 1727204519.93540: calling self._execute() 44842 1727204519.93823: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204519.93827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204519.93837: variable 'omit' from source: magic vars 44842 1727204519.94629: variable 'ansible_distribution_major_version' from source: facts 44842 1727204519.94756: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204519.94767: _execute() done 44842 1727204519.94772: dumping result to json 44842 1727204519.94775: done dumping result, returning 44842 1727204519.94781: done running TaskExecutor() for managed-node1/TASK: Include the task 'delete_interface.yml' [0affcd87-79f5-aad0-d242-000000000089] 44842 1727204519.94787: sending task result for task 0affcd87-79f5-aad0-d242-000000000089 44842 1727204519.94907: done sending task result for task 0affcd87-79f5-aad0-d242-000000000089 44842 1727204519.94911: WORKER PROCESS EXITING 44842 1727204519.94937: no more pending results, returning what we have 44842 1727204519.94942: in VariableManager get_vars() 44842 1727204519.94979: Calling all_inventory to load vars for managed-node1 44842 1727204519.94982: Calling groups_inventory to load vars for managed-node1 44842 1727204519.94987: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204519.95002: Calling all_plugins_play to load vars for managed-node1 44842 1727204519.95005: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204519.95009: Calling groups_plugins_play to load vars for managed-node1 44842 1727204519.97658: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204519.99362: done with get_vars() 44842 1727204519.99396: variable 'ansible_search_path' from source: unknown 44842 1727204519.99413: we have included files to process 44842 1727204519.99414: generating all_blocks data 44842 1727204519.99416: done generating all_blocks data 44842 1727204519.99417: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44842 1727204519.99418: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44842 1727204519.99421: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44842 1727204519.99668: done processing included file 44842 1727204519.99670: iterating over new_blocks loaded from include file 44842 1727204519.99671: in VariableManager get_vars() 44842 1727204519.99687: done with get_vars() 44842 1727204519.99689: filtering new block on tags 44842 1727204519.99703: done filtering new block on tags 44842 1727204519.99705: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 44842 1727204519.99715: extending task lists for all hosts with included blocks 44842 1727204519.99746: done extending task lists 44842 1727204519.99747: done processing included files 44842 1727204519.99748: results queue empty 44842 1727204519.99748: checking for any_errors_fatal 44842 1727204519.99750: done checking for any_errors_fatal 44842 1727204519.99751: checking for max_fail_percentage 44842 1727204519.99752: done checking for max_fail_percentage 44842 1727204519.99753: checking to see if all hosts have failed and the running result 
is not ok 44842 1727204519.99753: done checking to see if all hosts have failed 44842 1727204519.99754: getting the remaining hosts for this loop 44842 1727204519.99755: done getting the remaining hosts for this loop 44842 1727204519.99758: getting the next task for host managed-node1 44842 1727204519.99761: done getting next task for host managed-node1 44842 1727204519.99763: ^ task is: TASK: Remove test interface if necessary 44842 1727204519.99767: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204519.99769: getting variables 44842 1727204519.99770: in VariableManager get_vars() 44842 1727204519.99778: Calling all_inventory to load vars for managed-node1 44842 1727204519.99780: Calling groups_inventory to load vars for managed-node1 44842 1727204519.99782: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204519.99787: Calling all_plugins_play to load vars for managed-node1 44842 1727204519.99790: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204519.99792: Calling groups_plugins_play to load vars for managed-node1 44842 1727204520.01259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204520.05142: done with get_vars() 44842 1727204520.05577: done getting variables 44842 1727204520.05620: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:02:00 -0400 (0:00:00.141) 0:00:30.224 ***** 44842 1727204520.05656: entering _queue_task() for managed-node1/command 44842 1727204520.06113: worker is 1 (out of 1 available) 44842 1727204520.06124: exiting _queue_task() for managed-node1/command 44842 1727204520.06135: done queuing things up, now waiting for results queue to drain 44842 1727204520.06137: waiting for pending results... 
44842 1727204520.06980: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 44842 1727204520.07215: in run() - task 0affcd87-79f5-aad0-d242-00000000058f 44842 1727204520.07366: variable 'ansible_search_path' from source: unknown 44842 1727204520.07376: variable 'ansible_search_path' from source: unknown 44842 1727204520.07420: calling self._execute() 44842 1727204520.07640: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204520.07653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204520.07787: variable 'omit' from source: magic vars 44842 1727204520.08396: variable 'ansible_distribution_major_version' from source: facts 44842 1727204520.08557: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204520.08570: variable 'omit' from source: magic vars 44842 1727204520.08610: variable 'omit' from source: magic vars 44842 1727204520.08822: variable 'interface' from source: set_fact 44842 1727204520.08845: variable 'omit' from source: magic vars 44842 1727204520.09004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204520.09039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204520.09061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204520.09199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204520.09217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204520.09256: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204520.09269: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204520.09278: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204520.09384: Set connection var ansible_shell_type to sh 44842 1727204520.09527: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204520.09538: Set connection var ansible_connection to ssh 44842 1727204520.09548: Set connection var ansible_pipelining to False 44842 1727204520.09559: Set connection var ansible_timeout to 10 44842 1727204520.09575: Set connection var ansible_shell_executable to /bin/sh 44842 1727204520.09601: variable 'ansible_shell_executable' from source: unknown 44842 1727204520.09629: variable 'ansible_connection' from source: unknown 44842 1727204520.09637: variable 'ansible_module_compression' from source: unknown 44842 1727204520.09738: variable 'ansible_shell_type' from source: unknown 44842 1727204520.09746: variable 'ansible_shell_executable' from source: unknown 44842 1727204520.09754: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204520.09762: variable 'ansible_pipelining' from source: unknown 44842 1727204520.09772: variable 'ansible_timeout' from source: unknown 44842 1727204520.09780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204520.09919: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204520.10073: variable 'omit' from source: magic vars 44842 1727204520.10083: starting attempt loop 44842 1727204520.10089: running the handler 44842 1727204520.10108: _low_level_execute_command(): starting 44842 1727204520.10121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204520.12111: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.12118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.12240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.12244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.12246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204520.12248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.12309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.12369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.12373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.12467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.14119: stdout chunk (state=3): >>>/root <<< 44842 1727204520.14208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.14293: stderr chunk (state=3): >>><<< 44842 1727204520.14296: stdout chunk (state=3): >>><<< 44842 1727204520.14419: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.14423: _low_level_execute_command(): starting 44842 1727204520.14427: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294 `" && echo ansible-tmp-1727204520.1431928-47143-185106343217294="` echo /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294 `" ) && sleep 0' 44842 1727204520.15417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.15420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.15456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204520.15469: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.15472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204520.15474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.15538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.15541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.16087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.16168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.18011: stdout chunk (state=3): >>>ansible-tmp-1727204520.1431928-47143-185106343217294=/root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294 <<< 44842 1727204520.18124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.18210: stderr chunk (state=3): >>><<< 44842 1727204520.18213: stdout chunk (state=3): >>><<< 44842 1727204520.18489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204520.1431928-47143-185106343217294=/root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.18492: variable 'ansible_module_compression' from source: unknown 44842 1727204520.18494: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204520.18497: variable 'ansible_facts' from source: unknown 44842 1727204520.18499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/AnsiballZ_command.py 44842 1727204520.18679: Sending initial data 44842 1727204520.18682: Sent initial data (156 bytes) 44842 1727204520.19826: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204520.19842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.19859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.19880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.19932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.19945: 
stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204520.19960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.19982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204520.19994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204520.20009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204520.20029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.20044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.20060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.20076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.20089: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204520.20104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.20187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.20210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.20232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.20325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.22037: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 44842 1727204520.22042: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204520.22087: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204520.22146: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpa6t8rbt3 /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/AnsiballZ_command.py <<< 44842 1727204520.22198: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204520.23383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.23569: stderr chunk (state=3): >>><<< 44842 1727204520.23572: stdout chunk (state=3): >>><<< 44842 1727204520.23574: done transferring module to remote 44842 1727204520.23576: _low_level_execute_command(): starting 44842 1727204520.23675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/ /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/AnsiballZ_command.py && sleep 0' 44842 1727204520.24619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204520.24638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.24654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.24677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.24719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.24736: stderr chunk 
(state=3): >>>debug2: match not found <<< 44842 1727204520.24756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.24784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204520.24797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204520.24810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204520.24822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.24846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.24866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.24879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.24889: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204520.24901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.24981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.25010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.25033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.25122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.26909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.26913: stdout chunk (state=3): >>><<< 44842 1727204520.26915: stderr chunk (state=3): >>><<< 44842 1727204520.27008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.27012: _low_level_execute_command(): starting 44842 1727204520.27015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/AnsiballZ_command.py && sleep 0' 44842 1727204520.27554: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204520.27572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.27589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.27607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.27648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.27661: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204520.27683: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.27702: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204520.27715: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204520.27726: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204520.27739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.27753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.27772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.27785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.27797: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204520.27811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.27892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.27913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.27930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.28183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.42727: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:02:00.409408", "end": "2024-09-24 15:02:00.426216", "delta": "0:00:00.016808", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": 
null}}} <<< 44842 1727204520.43957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204520.43960: stdout chunk (state=3): >>><<< 44842 1727204520.43963: stderr chunk (state=3): >>><<< 44842 1727204520.44109: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:02:00.409408", "end": "2024-09-24 15:02:00.426216", "delta": "0:00:00.016808", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 
closed. 44842 1727204520.44114: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204520.44122: _low_level_execute_command(): starting 44842 1727204520.44124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204520.1431928-47143-185106343217294/ > /dev/null 2>&1 && sleep 0' 44842 1727204520.44733: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204520.44748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.44766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.44793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.44836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.44850: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204520.44866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.44887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204520.44906: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204520.44918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204520.44931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.44945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.44961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.44984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.44998: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204520.45018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.45096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.45127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.45145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.45237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.46997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.47084: stderr chunk (state=3): >>><<< 44842 1727204520.47104: stdout chunk (state=3): >>><<< 44842 1727204520.47275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.47279: handler run complete 44842 1727204520.47281: Evaluated conditional (False): False 44842 1727204520.47283: attempt loop complete, returning result 44842 1727204520.47285: _execute() done 44842 1727204520.47287: dumping result to json 44842 1727204520.47289: done dumping result, returning 44842 1727204520.47291: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [0affcd87-79f5-aad0-d242-00000000058f] 44842 1727204520.47293: sending task result for task 0affcd87-79f5-aad0-d242-00000000058f 44842 1727204520.47432: done sending task result for task 0affcd87-79f5-aad0-d242-00000000058f 44842 1727204520.47436: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.016808", "end": "2024-09-24 15:02:00.426216", "rc": 0, "start": "2024-09-24 15:02:00.409408" } 44842 1727204520.47507: no more pending results, returning what we have 44842 1727204520.47511: results queue empty 44842 1727204520.47512: checking for any_errors_fatal 44842 1727204520.47513: done checking for any_errors_fatal 44842 1727204520.47514: checking for max_fail_percentage 44842 1727204520.47516: done checking for max_fail_percentage 44842 1727204520.47517: checking to see if all hosts have failed and the 
running result is not ok 44842 1727204520.47518: done checking to see if all hosts have failed 44842 1727204520.47518: getting the remaining hosts for this loop 44842 1727204520.47520: done getting the remaining hosts for this loop 44842 1727204520.47524: getting the next task for host managed-node1 44842 1727204520.47534: done getting next task for host managed-node1 44842 1727204520.47536: ^ task is: TASK: meta (flush_handlers) 44842 1727204520.47539: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204520.47543: getting variables 44842 1727204520.47545: in VariableManager get_vars() 44842 1727204520.47580: Calling all_inventory to load vars for managed-node1 44842 1727204520.47583: Calling groups_inventory to load vars for managed-node1 44842 1727204520.47587: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204520.47599: Calling all_plugins_play to load vars for managed-node1 44842 1727204520.47602: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204520.47605: Calling groups_plugins_play to load vars for managed-node1 44842 1727204520.50525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204520.54117: done with get_vars() 44842 1727204520.54151: done getting variables 44842 1727204520.54340: in VariableManager get_vars() 44842 1727204520.54351: Calling all_inventory to load vars for managed-node1 44842 1727204520.54354: Calling groups_inventory to load vars for managed-node1 44842 1727204520.54357: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204520.54362: Calling all_plugins_play to load vars for managed-node1 44842 1727204520.54366: Calling groups_plugins_inventory 
to load vars for managed-node1 44842 1727204520.54369: Calling groups_plugins_play to load vars for managed-node1 44842 1727204520.57092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204520.60411: done with get_vars() 44842 1727204520.60451: done queuing things up, now waiting for results queue to drain 44842 1727204520.60453: results queue empty 44842 1727204520.60454: checking for any_errors_fatal 44842 1727204520.60457: done checking for any_errors_fatal 44842 1727204520.60458: checking for max_fail_percentage 44842 1727204520.60459: done checking for max_fail_percentage 44842 1727204520.60460: checking to see if all hosts have failed and the running result is not ok 44842 1727204520.60461: done checking to see if all hosts have failed 44842 1727204520.60461: getting the remaining hosts for this loop 44842 1727204520.60472: done getting the remaining hosts for this loop 44842 1727204520.60475: getting the next task for host managed-node1 44842 1727204520.60479: done getting next task for host managed-node1 44842 1727204520.60480: ^ task is: TASK: meta (flush_handlers) 44842 1727204520.60481: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204520.60484: getting variables 44842 1727204520.60485: in VariableManager get_vars() 44842 1727204520.60494: Calling all_inventory to load vars for managed-node1 44842 1727204520.60496: Calling groups_inventory to load vars for managed-node1 44842 1727204520.60498: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204520.60503: Calling all_plugins_play to load vars for managed-node1 44842 1727204520.60504: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204520.60507: Calling groups_plugins_play to load vars for managed-node1 44842 1727204520.62709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204520.65453: done with get_vars() 44842 1727204520.65485: done getting variables 44842 1727204520.65567: in VariableManager get_vars() 44842 1727204520.65578: Calling all_inventory to load vars for managed-node1 44842 1727204520.65580: Calling groups_inventory to load vars for managed-node1 44842 1727204520.65589: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204520.65595: Calling all_plugins_play to load vars for managed-node1 44842 1727204520.65597: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204520.65600: Calling groups_plugins_play to load vars for managed-node1 44842 1727204520.68139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204520.69931: done with get_vars() 44842 1727204520.70330: done queuing things up, now waiting for results queue to drain 44842 1727204520.70332: results queue empty 44842 1727204520.70333: checking for any_errors_fatal 44842 1727204520.70335: done checking for any_errors_fatal 44842 1727204520.70340: checking for max_fail_percentage 44842 1727204520.70341: done checking for max_fail_percentage 44842 1727204520.70342: checking to see if all hosts have failed and the running result is not 
ok 44842 1727204520.70343: done checking to see if all hosts have failed 44842 1727204520.70343: getting the remaining hosts for this loop 44842 1727204520.70344: done getting the remaining hosts for this loop 44842 1727204520.70351: getting the next task for host managed-node1 44842 1727204520.70354: done getting next task for host managed-node1 44842 1727204520.70355: ^ task is: None 44842 1727204520.70357: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204520.70358: done queuing things up, now waiting for results queue to drain 44842 1727204520.70359: results queue empty 44842 1727204520.70359: checking for any_errors_fatal 44842 1727204520.70360: done checking for any_errors_fatal 44842 1727204520.70361: checking for max_fail_percentage 44842 1727204520.70361: done checking for max_fail_percentage 44842 1727204520.70362: checking to see if all hosts have failed and the running result is not ok 44842 1727204520.70363: done checking to see if all hosts have failed 44842 1727204520.70367: getting the next task for host managed-node1 44842 1727204520.70369: done getting next task for host managed-node1 44842 1727204520.70370: ^ task is: None 44842 1727204520.70371: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204520.70420: in VariableManager get_vars() 44842 1727204520.70445: done with get_vars() 44842 1727204520.70460: in VariableManager get_vars() 44842 1727204520.70477: done with get_vars() 44842 1727204520.70482: variable 'omit' from source: magic vars 44842 1727204520.70606: variable 'profile' from source: play vars 44842 1727204520.70714: in VariableManager get_vars() 44842 1727204520.70728: done with get_vars() 44842 1727204520.70751: variable 'omit' from source: magic vars 44842 1727204520.70828: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 44842 1727204520.71379: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44842 1727204520.71399: getting the remaining hosts for this loop 44842 1727204520.71400: done getting the remaining hosts for this loop 44842 1727204520.71402: getting the next task for host managed-node1 44842 1727204520.71404: done getting next task for host managed-node1 44842 1727204520.71405: ^ task is: TASK: Gathering Facts 44842 1727204520.71406: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204520.71407: getting variables 44842 1727204520.71408: in VariableManager get_vars() 44842 1727204520.71415: Calling all_inventory to load vars for managed-node1 44842 1727204520.71417: Calling groups_inventory to load vars for managed-node1 44842 1727204520.71418: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204520.71422: Calling all_plugins_play to load vars for managed-node1 44842 1727204520.71423: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204520.71425: Calling groups_plugins_play to load vars for managed-node1 44842 1727204520.72510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204520.74585: done with get_vars() 44842 1727204520.74616: done getting variables 44842 1727204520.74679: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 15:02:00 -0400 (0:00:00.690) 0:00:30.915 ***** 44842 1727204520.74707: entering _queue_task() for managed-node1/gather_facts 44842 1727204520.75074: worker is 1 (out of 1 available) 44842 1727204520.75092: exiting _queue_task() for managed-node1/gather_facts 44842 1727204520.75104: done queuing things up, now waiting for results queue to drain 44842 1727204520.75105: waiting for pending results... 
44842 1727204520.75288: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44842 1727204520.75353: in run() - task 0affcd87-79f5-aad0-d242-00000000059d 44842 1727204520.75367: variable 'ansible_search_path' from source: unknown 44842 1727204520.75399: calling self._execute() 44842 1727204520.75482: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204520.75485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204520.75495: variable 'omit' from source: magic vars 44842 1727204520.75791: variable 'ansible_distribution_major_version' from source: facts 44842 1727204520.75802: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204520.75808: variable 'omit' from source: magic vars 44842 1727204520.75828: variable 'omit' from source: magic vars 44842 1727204520.75854: variable 'omit' from source: magic vars 44842 1727204520.75891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204520.75918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204520.75936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204520.75957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204520.75966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204520.75992: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204520.75995: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204520.75998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204520.76072: Set connection var ansible_shell_type to sh 44842 1727204520.76080: Set connection 
var ansible_module_compression to ZIP_DEFLATED 44842 1727204520.76085: Set connection var ansible_connection to ssh 44842 1727204520.76092: Set connection var ansible_pipelining to False 44842 1727204520.76098: Set connection var ansible_timeout to 10 44842 1727204520.76105: Set connection var ansible_shell_executable to /bin/sh 44842 1727204520.76121: variable 'ansible_shell_executable' from source: unknown 44842 1727204520.76124: variable 'ansible_connection' from source: unknown 44842 1727204520.76127: variable 'ansible_module_compression' from source: unknown 44842 1727204520.76129: variable 'ansible_shell_type' from source: unknown 44842 1727204520.76132: variable 'ansible_shell_executable' from source: unknown 44842 1727204520.76134: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204520.76137: variable 'ansible_pipelining' from source: unknown 44842 1727204520.76139: variable 'ansible_timeout' from source: unknown 44842 1727204520.76143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204520.76285: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204520.76296: variable 'omit' from source: magic vars 44842 1727204520.76300: starting attempt loop 44842 1727204520.76303: running the handler 44842 1727204520.76318: variable 'ansible_facts' from source: unknown 44842 1727204520.76334: _low_level_execute_command(): starting 44842 1727204520.76341: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204520.77126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.77226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.78749: stdout chunk (state=3): >>>/root <<< 44842 1727204520.78881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.78908: stderr chunk (state=3): >>><<< 44842 1727204520.78911: stdout chunk (state=3): >>><<< 44842 1727204520.78930: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.78943: _low_level_execute_command(): starting 44842 1727204520.78949: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039 `" && echo ansible-tmp-1727204520.789317-47179-272456184298039="` echo /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039 `" ) && sleep 0' 44842 1727204520.79414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.79419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.79452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.79455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 
1727204520.79467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.79515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.79518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.79580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.81415: stdout chunk (state=3): >>>ansible-tmp-1727204520.789317-47179-272456184298039=/root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039 <<< 44842 1727204520.81526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.81578: stderr chunk (state=3): >>><<< 44842 1727204520.81582: stdout chunk (state=3): >>><<< 44842 1727204520.81599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204520.789317-47179-272456184298039=/root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.81628: variable 'ansible_module_compression' from source: unknown 44842 1727204520.81673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44842 1727204520.81723: variable 'ansible_facts' from source: unknown 44842 1727204520.81841: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/AnsiballZ_setup.py 44842 1727204520.81969: Sending initial data 44842 1727204520.81980: Sent initial data (153 bytes) 44842 1727204520.82647: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.82651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.82697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204520.82700: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.82705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.82707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204520.82709: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.82756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204520.82765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.82822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.84516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204520.84567: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204520.84619: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpvz7zhq8x /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/AnsiballZ_setup.py <<< 44842 1727204520.84672: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204520.86394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.86508: stderr chunk (state=3): >>><<< 44842 1727204520.86511: stdout chunk (state=3): >>><<< 44842 1727204520.86529: done transferring module to remote 44842 1727204520.86538: _low_level_execute_command(): starting 44842 1727204520.86544: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/ /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/AnsiballZ_setup.py && sleep 0' 44842 1727204520.87004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.87016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.87042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204520.87054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.87100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.87113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.87177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204520.88877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204520.88926: stderr chunk (state=3): >>><<< 44842 1727204520.88930: stdout chunk (state=3): >>><<< 44842 1727204520.88946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204520.88953: _low_level_execute_command(): starting 44842 1727204520.88958: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/AnsiballZ_setup.py && sleep 0' 44842 1727204520.89430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204520.89443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204520.89461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204520.89478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204520.89488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204520.89533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204520.89546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204520.89611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204522.40680: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2760, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 772, "free": 2760}, "nocache": {"free": 3236, "used": 296}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", 
"partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 784, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271720448, "block_size": 4096, "block_total": 65519355, "block_available": 64519463, "block_used": 999892, "inode_total": 131071472, "inode_available": 130998228, "inode_used": 73244, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_selinux_python_present": true, "ansible_<<< 44842 1727204522.40688: stdout chunk (state=3): >>>selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.34, "5m": 0.42, "15m": 0.28}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "02", "epoch": "1727204522", "epoch_int": "1727204522", "date": "2024-09-24", "time": "15:02:02", "iso8601_micro": "2024-09-24T19:02:02.353931Z", "iso8601": "2024-09-24T19:02:02Z", "iso8601_basic": "20240924T150202353931", "iso8601_basic_short": 
"20240924T150202", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "of<<< 44842 1727204522.40696: stdout chunk (state=3): >>>f [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204522.42354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204522.42497: stderr chunk (state=3): >>><<< 44842 1727204522.42502: stdout chunk (state=3): >>><<< 44842 1727204522.42706: _low_level_execute_command() done: rc=0, stdout=
44842 1727204522.43080: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204522.43108: _low_level_execute_command(): starting 44842 1727204522.43119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204520.789317-47179-272456184298039/ > /dev/null 2>&1 && sleep 0' 44842 1727204522.45430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204522.45611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204522.45634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204522.45668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204522.45763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204522.45843: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204522.45881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204522.45911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204522.45943: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.9.148 is address <<< 44842 1727204522.45956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204522.45971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204522.45985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204522.46016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204522.46046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204522.46071: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204522.46108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204522.46214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204522.46380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204522.46400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204522.46533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204522.48671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204522.48710: stderr chunk (state=3): >>><<< 44842 1727204522.48714: stdout chunk (state=3): >>><<< 44842 1727204522.48773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204522.48776: handler run complete 44842 1727204522.49070: variable 'ansible_facts' from source: unknown 44842 1727204522.49073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.49316: variable 'ansible_facts' from source: unknown 44842 1727204522.49604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.50078: attempt loop complete, returning result 44842 1727204522.50096: _execute() done 44842 1727204522.50124: dumping result to json 44842 1727204522.50212: done dumping result, returning 44842 1727204522.50260: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-aad0-d242-00000000059d] 44842 1727204522.50304: sending task result for task 0affcd87-79f5-aad0-d242-00000000059d ok: [managed-node1] 44842 1727204522.51325: no more pending results, returning what we have 44842 1727204522.51329: results queue empty 44842 1727204522.51330: checking for any_errors_fatal 44842 1727204522.51331: done checking for any_errors_fatal 44842 1727204522.51332: checking for max_fail_percentage 44842 1727204522.51334: done checking for max_fail_percentage 44842 1727204522.51335: checking to see if all hosts have failed and the running result is not ok 44842 1727204522.51336: done 
checking to see if all hosts have failed 44842 1727204522.51336: getting the remaining hosts for this loop 44842 1727204522.51339: done getting the remaining hosts for this loop 44842 1727204522.51343: getting the next task for host managed-node1 44842 1727204522.51350: done getting next task for host managed-node1 44842 1727204522.51352: ^ task is: TASK: meta (flush_handlers) 44842 1727204522.51354: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204522.51359: getting variables 44842 1727204522.51363: in VariableManager get_vars() 44842 1727204522.51400: Calling all_inventory to load vars for managed-node1 44842 1727204522.51403: Calling groups_inventory to load vars for managed-node1 44842 1727204522.51406: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204522.51419: Calling all_plugins_play to load vars for managed-node1 44842 1727204522.51422: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204522.51425: Calling groups_plugins_play to load vars for managed-node1 44842 1727204522.52291: done sending task result for task 0affcd87-79f5-aad0-d242-00000000059d 44842 1727204522.52296: WORKER PROCESS EXITING 44842 1727204522.53381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.55552: done with get_vars() 44842 1727204522.55589: done getting variables 44842 1727204522.55657: in VariableManager get_vars() 44842 1727204522.55791: Calling all_inventory to load vars for managed-node1 44842 1727204522.55794: Calling groups_inventory to load vars for managed-node1 44842 1727204522.55796: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204522.55801: Calling 
all_plugins_play to load vars for managed-node1 44842 1727204522.55804: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204522.55811: Calling groups_plugins_play to load vars for managed-node1 44842 1727204522.57769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.59890: done with get_vars() 44842 1727204522.59923: done queuing things up, now waiting for results queue to drain 44842 1727204522.59925: results queue empty 44842 1727204522.59926: checking for any_errors_fatal 44842 1727204522.59930: done checking for any_errors_fatal 44842 1727204522.59931: checking for max_fail_percentage 44842 1727204522.59932: done checking for max_fail_percentage 44842 1727204522.59933: checking to see if all hosts have failed and the running result is not ok 44842 1727204522.59934: done checking to see if all hosts have failed 44842 1727204522.59934: getting the remaining hosts for this loop 44842 1727204522.59935: done getting the remaining hosts for this loop 44842 1727204522.59939: getting the next task for host managed-node1 44842 1727204522.59943: done getting next task for host managed-node1 44842 1727204522.59946: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44842 1727204522.59948: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204522.59958: getting variables 44842 1727204522.59959: in VariableManager get_vars() 44842 1727204522.59978: Calling all_inventory to load vars for managed-node1 44842 1727204522.59981: Calling groups_inventory to load vars for managed-node1 44842 1727204522.59983: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204522.59993: Calling all_plugins_play to load vars for managed-node1 44842 1727204522.59996: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204522.59999: Calling groups_plugins_play to load vars for managed-node1 44842 1727204522.61353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.63410: done with get_vars() 44842 1727204522.63444: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:02:02 -0400 (0:00:01.888) 0:00:32.803 ***** 44842 1727204522.63532: entering _queue_task() for managed-node1/include_tasks 44842 1727204522.63886: worker is 1 (out of 1 available) 44842 1727204522.63899: exiting _queue_task() for managed-node1/include_tasks 44842 1727204522.63912: done queuing things up, now waiting for results queue to drain 44842 1727204522.63913: waiting for pending results... 
44842 1727204522.64305: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44842 1727204522.64431: in run() - task 0affcd87-79f5-aad0-d242-000000000091 44842 1727204522.64525: variable 'ansible_search_path' from source: unknown 44842 1727204522.64534: variable 'ansible_search_path' from source: unknown 44842 1727204522.64599: calling self._execute() 44842 1727204522.64714: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204522.64731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204522.64749: variable 'omit' from source: magic vars 44842 1727204522.65269: variable 'ansible_distribution_major_version' from source: facts 44842 1727204522.65331: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204522.65345: _execute() done 44842 1727204522.65352: dumping result to json 44842 1727204522.65359: done dumping result, returning 44842 1727204522.65376: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-aad0-d242-000000000091] 44842 1727204522.65391: sending task result for task 0affcd87-79f5-aad0-d242-000000000091 44842 1727204522.65542: no more pending results, returning what we have 44842 1727204522.65548: in VariableManager get_vars() 44842 1727204522.65605: Calling all_inventory to load vars for managed-node1 44842 1727204522.65608: Calling groups_inventory to load vars for managed-node1 44842 1727204522.65611: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204522.65625: Calling all_plugins_play to load vars for managed-node1 44842 1727204522.65629: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204522.65632: Calling groups_plugins_play to load vars for managed-node1 44842 1727204522.66839: done sending task result for task 0affcd87-79f5-aad0-d242-000000000091 44842 
1727204522.66843: WORKER PROCESS EXITING 44842 1727204522.68241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.69982: done with get_vars() 44842 1727204522.70018: variable 'ansible_search_path' from source: unknown 44842 1727204522.70020: variable 'ansible_search_path' from source: unknown 44842 1727204522.70245: we have included files to process 44842 1727204522.70247: generating all_blocks data 44842 1727204522.70248: done generating all_blocks data 44842 1727204522.70249: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204522.70250: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204522.70254: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44842 1727204522.70951: done processing included file 44842 1727204522.70954: iterating over new_blocks loaded from include file 44842 1727204522.70955: in VariableManager get_vars() 44842 1727204522.70984: done with get_vars() 44842 1727204522.70986: filtering new block on tags 44842 1727204522.71004: done filtering new block on tags 44842 1727204522.71006: in VariableManager get_vars() 44842 1727204522.71026: done with get_vars() 44842 1727204522.71027: filtering new block on tags 44842 1727204522.71045: done filtering new block on tags 44842 1727204522.71048: in VariableManager get_vars() 44842 1727204522.71071: done with get_vars() 44842 1727204522.71072: filtering new block on tags 44842 1727204522.71089: done filtering new block on tags 44842 1727204522.71091: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 44842 1727204522.71097: extending task lists for all hosts 
with included blocks 44842 1727204522.71486: done extending task lists 44842 1727204522.71488: done processing included files 44842 1727204522.71488: results queue empty 44842 1727204522.71489: checking for any_errors_fatal 44842 1727204522.71491: done checking for any_errors_fatal 44842 1727204522.71492: checking for max_fail_percentage 44842 1727204522.71493: done checking for max_fail_percentage 44842 1727204522.71494: checking to see if all hosts have failed and the running result is not ok 44842 1727204522.71494: done checking to see if all hosts have failed 44842 1727204522.71495: getting the remaining hosts for this loop 44842 1727204522.71497: done getting the remaining hosts for this loop 44842 1727204522.71499: getting the next task for host managed-node1 44842 1727204522.71503: done getting next task for host managed-node1 44842 1727204522.71506: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44842 1727204522.71509: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204522.71518: getting variables 44842 1727204522.71519: in VariableManager get_vars() 44842 1727204522.71533: Calling all_inventory to load vars for managed-node1 44842 1727204522.71535: Calling groups_inventory to load vars for managed-node1 44842 1727204522.71537: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204522.71543: Calling all_plugins_play to load vars for managed-node1 44842 1727204522.71545: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204522.71548: Calling groups_plugins_play to load vars for managed-node1 44842 1727204522.73147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.75244: done with get_vars() 44842 1727204522.75285: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.118) 0:00:32.921 ***** 44842 1727204522.75378: entering _queue_task() for managed-node1/setup 44842 1727204522.76018: worker is 1 (out of 1 available) 44842 1727204522.76032: exiting _queue_task() for managed-node1/setup 44842 1727204522.76045: done queuing things up, now waiting for results queue to drain 44842 1727204522.76047: waiting for pending results... 
44842 1727204522.77052: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44842 1727204522.77447: in run() - task 0affcd87-79f5-aad0-d242-0000000005de 44842 1727204522.77473: variable 'ansible_search_path' from source: unknown 44842 1727204522.77482: variable 'ansible_search_path' from source: unknown 44842 1727204522.77527: calling self._execute() 44842 1727204522.77744: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204522.77762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204522.77780: variable 'omit' from source: magic vars 44842 1727204522.78775: variable 'ansible_distribution_major_version' from source: facts 44842 1727204522.78943: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204522.79456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204522.82415: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204522.82499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204522.82549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204522.82594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204522.82689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204522.82844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204522.82943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204522.82981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204522.83024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204522.83040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204522.83106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204522.83208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204522.83237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204522.83286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204522.83306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204522.83480: variable '__network_required_facts' from source: role 
'' defaults 44842 1727204522.83495: variable 'ansible_facts' from source: unknown 44842 1727204522.84717: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44842 1727204522.84726: when evaluation is False, skipping this task 44842 1727204522.84732: _execute() done 44842 1727204522.84738: dumping result to json 44842 1727204522.84744: done dumping result, returning 44842 1727204522.84754: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-aad0-d242-0000000005de] 44842 1727204522.84769: sending task result for task 0affcd87-79f5-aad0-d242-0000000005de skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204522.84921: no more pending results, returning what we have 44842 1727204522.84925: results queue empty 44842 1727204522.84926: checking for any_errors_fatal 44842 1727204522.84928: done checking for any_errors_fatal 44842 1727204522.84928: checking for max_fail_percentage 44842 1727204522.84931: done checking for max_fail_percentage 44842 1727204522.84932: checking to see if all hosts have failed and the running result is not ok 44842 1727204522.84932: done checking to see if all hosts have failed 44842 1727204522.84933: getting the remaining hosts for this loop 44842 1727204522.84935: done getting the remaining hosts for this loop 44842 1727204522.84939: getting the next task for host managed-node1 44842 1727204522.84949: done getting next task for host managed-node1 44842 1727204522.84953: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44842 1727204522.84957: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204522.84977: getting variables 44842 1727204522.84979: in VariableManager get_vars() 44842 1727204522.85023: Calling all_inventory to load vars for managed-node1 44842 1727204522.85026: Calling groups_inventory to load vars for managed-node1 44842 1727204522.85029: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204522.85040: Calling all_plugins_play to load vars for managed-node1 44842 1727204522.85043: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204522.85046: Calling groups_plugins_play to load vars for managed-node1 44842 1727204522.86087: done sending task result for task 0affcd87-79f5-aad0-d242-0000000005de 44842 1727204522.86091: WORKER PROCESS EXITING 44842 1727204522.88483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204522.94447: done with get_vars() 44842 1727204522.94491: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:02:02 -0400 (0:00:00.192) 0:00:33.114 ***** 44842 1727204522.94594: entering _queue_task() for managed-node1/stat 44842 1727204522.95947: worker is 1 (out of 1 available) 44842 1727204522.95958: exiting _queue_task() for managed-node1/stat 44842 1727204522.95973: done queuing things up, now waiting for results queue to drain 44842 1727204522.95975: waiting for pending results... 
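The skip above comes from the role's guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to `False`: facts were already gathered, so re-running `setup` is unnecessary. The Jinja `difference` filter keeps items present in the left list but absent from the right one. A rough Python sketch of that semantics, with hypothetical fact names (the actual contents of `__network_required_facts` are not shown in this log):

```python
# Hypothetical values -- the log does not print the real lists.
required_facts = ["distribution", "distribution_major_version", "os_family"]
gathered_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Jinja's difference(): elements of the left list not found in the right.
missing = [f for f in required_facts if f not in gathered_facts]

# The role only runs its setup task when something is missing.
needs_setup = len(missing) > 0
print(needs_setup)  # False -> matches "when evaluation is False, skipping this task"
```

When the conditional is `False`, the executor short-circuits before any module is shipped to the host, which is why no `_low_level_execute_command()` calls appear for this task.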
44842 1727204522.96747: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 44842 1727204522.97003: in run() - task 0affcd87-79f5-aad0-d242-0000000005e0 44842 1727204522.97088: variable 'ansible_search_path' from source: unknown 44842 1727204522.97099: variable 'ansible_search_path' from source: unknown 44842 1727204522.97140: calling self._execute() 44842 1727204522.97465: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204522.97478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204522.97492: variable 'omit' from source: magic vars 44842 1727204522.97862: variable 'ansible_distribution_major_version' from source: facts 44842 1727204522.98186: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204522.98356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204522.98832: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204522.99122: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204522.99308: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204522.99348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204522.99440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204522.99702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204522.99735: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204522.99772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204522.99875: variable '__network_is_ostree' from source: set_fact 44842 1727204522.99980: Evaluated conditional (not __network_is_ostree is defined): False 44842 1727204522.99988: when evaluation is False, skipping this task 44842 1727204522.99996: _execute() done 44842 1727204523.00004: dumping result to json 44842 1727204523.00012: done dumping result, returning 44842 1727204523.00024: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-aad0-d242-0000000005e0] 44842 1727204523.00035: sending task result for task 0affcd87-79f5-aad0-d242-0000000005e0 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44842 1727204523.00215: no more pending results, returning what we have 44842 1727204523.00220: results queue empty 44842 1727204523.00221: checking for any_errors_fatal 44842 1727204523.00229: done checking for any_errors_fatal 44842 1727204523.00230: checking for max_fail_percentage 44842 1727204523.00232: done checking for max_fail_percentage 44842 1727204523.00233: checking to see if all hosts have failed and the running result is not ok 44842 1727204523.00234: done checking to see if all hosts have failed 44842 1727204523.00235: getting the remaining hosts for this loop 44842 1727204523.00237: done getting the remaining hosts for this loop 44842 1727204523.00242: getting the next task for host managed-node1 44842 1727204523.00249: done getting next task for host managed-node1 44842 
1727204523.00253: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44842 1727204523.00256: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204523.00273: getting variables 44842 1727204523.00276: in VariableManager get_vars() 44842 1727204523.00322: Calling all_inventory to load vars for managed-node1 44842 1727204523.00325: Calling groups_inventory to load vars for managed-node1 44842 1727204523.00328: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204523.00340: Calling all_plugins_play to load vars for managed-node1 44842 1727204523.00343: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204523.00346: Calling groups_plugins_play to load vars for managed-node1 44842 1727204523.00980: done sending task result for task 0affcd87-79f5-aad0-d242-0000000005e0 44842 1727204523.00984: WORKER PROCESS EXITING 44842 1727204523.03496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204523.09568: done with get_vars() 44842 1727204523.09615: done getting variables 44842 1727204523.09680: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
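Both ostree tasks report `false_condition: "not __network_is_ostree is defined"`: an earlier `set_fact` (visible as `variable '__network_is_ostree' from source: set_fact`) already defined the flag, so the `stat`/`set_fact` pair is skipped. Jinja's `is defined` test reduces, roughly, to a key-presence check on the available variables. A sketch under that assumption, with an illustrative value for the fact:

```python
# The fact was set earlier in the play; its exact value is not shown in the log,
# so False here is illustrative.
facts = {"__network_is_ostree": False}

# `not __network_is_ostree is defined` ~= the variable name is absent.
run_task = "__network_is_ostree" not in facts
print(run_task)  # False -> "Conditional result was False", task skipped
```

This caching pattern means the comparatively expensive `stat` of the ostree marker file runs at most once per host per play, no matter how often the role is included.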
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:02:03 -0400 (0:00:00.151) 0:00:33.265 ***** 44842 1727204523.09714: entering _queue_task() for managed-node1/set_fact 44842 1727204523.10053: worker is 1 (out of 1 available) 44842 1727204523.10571: exiting _queue_task() for managed-node1/set_fact 44842 1727204523.10584: done queuing things up, now waiting for results queue to drain 44842 1727204523.10586: waiting for pending results... 44842 1727204523.10871: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44842 1727204523.11311: in run() - task 0affcd87-79f5-aad0-d242-0000000005e1 44842 1727204523.11444: variable 'ansible_search_path' from source: unknown 44842 1727204523.11451: variable 'ansible_search_path' from source: unknown 44842 1727204523.11493: calling self._execute() 44842 1727204523.11631: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204523.11763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204523.11779: variable 'omit' from source: magic vars 44842 1727204523.12530: variable 'ansible_distribution_major_version' from source: facts 44842 1727204523.12550: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204523.12909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204523.13547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204523.13601: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204523.13708: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 
1727204523.13752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204523.13912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204523.13997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204523.14074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204523.14183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204523.14398: variable '__network_is_ostree' from source: set_fact 44842 1727204523.14411: Evaluated conditional (not __network_is_ostree is defined): False 44842 1727204523.14419: when evaluation is False, skipping this task 44842 1727204523.14426: _execute() done 44842 1727204523.14432: dumping result to json 44842 1727204523.14440: done dumping result, returning 44842 1727204523.14451: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-aad0-d242-0000000005e1] 44842 1727204523.14460: sending task result for task 0affcd87-79f5-aad0-d242-0000000005e1 44842 1727204523.14687: done sending task result for task 0affcd87-79f5-aad0-d242-0000000005e1 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44842 1727204523.14741: no more pending results, returning what we have 44842 1727204523.14745: results queue empty 
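The skip above is the interesting part of this task: an earlier `set_fact` already defined `__network_is_ostree`, so the task's `when: not __network_is_ostree is defined` guard evaluates to False and the task is skipped with the `false_condition` result shown. A minimal sketch of that decision (modelling only the single `X is defined` test used here, not Ansible's real Jinja2 conditional engine):

```python
# Minimal model of the "Evaluated conditional ... False -> skipping" step.
def evaluate_when(conditional: str, facts: dict) -> bool:
    # Only handles the one shape used in this log: "not <name> is defined".
    if conditional.startswith("not ") and conditional.endswith(" is defined"):
        name = conditional[len("not "):-len(" is defined")]
        return name not in facts
    raise NotImplementedError(conditional)

# The fact was set by a previous set_fact task, so the guard is False.
facts = {"__network_is_ostree": False}
cond = "not __network_is_ostree is defined"
if not evaluate_when(cond, facts):
    result = {
        "changed": False,
        "false_condition": cond,
        "skip_reason": "Conditional result was False",
    }
print(result)
```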
44842 1727204523.14747: checking for any_errors_fatal 44842 1727204523.14755: done checking for any_errors_fatal 44842 1727204523.14756: checking for max_fail_percentage 44842 1727204523.14758: done checking for max_fail_percentage 44842 1727204523.14759: checking to see if all hosts have failed and the running result is not ok 44842 1727204523.14760: done checking to see if all hosts have failed 44842 1727204523.14761: getting the remaining hosts for this loop 44842 1727204523.14763: done getting the remaining hosts for this loop 44842 1727204523.14769: getting the next task for host managed-node1 44842 1727204523.14779: done getting next task for host managed-node1 44842 1727204523.14784: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44842 1727204523.14787: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204523.14801: getting variables 44842 1727204523.14804: in VariableManager get_vars() 44842 1727204523.14849: Calling all_inventory to load vars for managed-node1 44842 1727204523.14852: Calling groups_inventory to load vars for managed-node1 44842 1727204523.14854: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204523.14867: Calling all_plugins_play to load vars for managed-node1 44842 1727204523.14871: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204523.14874: Calling groups_plugins_play to load vars for managed-node1 44842 1727204523.16074: WORKER PROCESS EXITING 44842 1727204523.17973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204523.20962: done with get_vars() 44842 1727204523.21809: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:02:03 -0400 (0:00:00.121) 0:00:33.387 ***** 44842 1727204523.21911: entering _queue_task() for managed-node1/service_facts 44842 1727204523.22248: worker is 1 (out of 1 available) 44842 1727204523.22259: exiting _queue_task() for managed-node1/service_facts 44842 1727204523.22272: done queuing things up, now waiting for results queue to drain 44842 1727204523.22274: waiting for pending results... 
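The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` lines above trace a producer-consumer handoff: the strategy process enqueues one task per host, a forked worker executes it, and the main loop drains a results queue. A toy thread-based sketch of that flow (Ansible actually uses worker processes; this is just the pattern, not its implementation):

```python
# Toy model of the _queue_task() / results-queue flow from the log.
import queue
import threading

task_q = queue.Queue()
result_q = queue.Queue()

def worker():
    # "running TaskExecutor() for <host>/<task>"
    host, action = task_q.get()
    result_q.put({"host": host, "action": action, "rc": 0})

t = threading.Thread(target=worker)
t.start()
task_q.put(("managed-node1", "service_facts"))  # entering _queue_task()
t.join()                                        # done queuing, waiting for results
result = result_q.get()                         # results queue drained
print(result)
```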
44842 1727204523.22836: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 44842 1727204523.22982: in run() - task 0affcd87-79f5-aad0-d242-0000000005e3 44842 1727204523.23011: variable 'ansible_search_path' from source: unknown 44842 1727204523.23018: variable 'ansible_search_path' from source: unknown 44842 1727204523.23058: calling self._execute() 44842 1727204523.23170: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204523.23183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204523.23199: variable 'omit' from source: magic vars 44842 1727204523.23608: variable 'ansible_distribution_major_version' from source: facts 44842 1727204523.23628: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204523.23642: variable 'omit' from source: magic vars 44842 1727204523.23712: variable 'omit' from source: magic vars 44842 1727204523.23754: variable 'omit' from source: magic vars 44842 1727204523.23806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204523.23845: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204523.23879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204523.23901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204523.23917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204523.23952: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204523.23961: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204523.23973: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44842 1727204523.24078: Set connection var ansible_shell_type to sh 44842 1727204523.24101: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204523.24111: Set connection var ansible_connection to ssh 44842 1727204523.24121: Set connection var ansible_pipelining to False 44842 1727204523.24131: Set connection var ansible_timeout to 10 44842 1727204523.24143: Set connection var ansible_shell_executable to /bin/sh 44842 1727204523.24172: variable 'ansible_shell_executable' from source: unknown 44842 1727204523.24180: variable 'ansible_connection' from source: unknown 44842 1727204523.24187: variable 'ansible_module_compression' from source: unknown 44842 1727204523.24200: variable 'ansible_shell_type' from source: unknown 44842 1727204523.24209: variable 'ansible_shell_executable' from source: unknown 44842 1727204523.24215: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204523.24222: variable 'ansible_pipelining' from source: unknown 44842 1727204523.24228: variable 'ansible_timeout' from source: unknown 44842 1727204523.24234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204523.24443: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204523.24459: variable 'omit' from source: magic vars 44842 1727204523.24471: starting attempt loop 44842 1727204523.24478: running the handler 44842 1727204523.24496: _low_level_execute_command(): starting 44842 1727204523.24508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204523.25302: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204523.25318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 44842 1727204523.25334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.25355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.25401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.25420: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204523.25436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.25456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204523.25471: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204523.25484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204523.25498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204523.25515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.25536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.25550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.25566: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204523.25582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.25667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204523.25685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204523.25700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204523.25826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204523.27501: stdout chunk (state=3): >>>/root <<< 44842 1727204523.27699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204523.27703: stdout chunk (state=3): >>><<< 44842 1727204523.27706: stderr chunk (state=3): >>><<< 44842 1727204523.27819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204523.27823: _low_level_execute_command(): starting 44842 1727204523.27827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532 `" && echo ansible-tmp-1727204523.2772653-47385-197515989121532="` echo /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532 `" ) && sleep 0' 44842 1727204523.29871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.29880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.30920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.30925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.30988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204523.31488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204523.31492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204523.31567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204523.33477: stdout chunk (state=3): >>>ansible-tmp-1727204523.2772653-47385-197515989121532=/root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532 <<< 44842 1727204523.33696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204523.33700: stdout chunk (state=3): >>><<< 44842 1727204523.33703: stderr chunk (state=3): >>><<< 44842 1727204523.34027: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204523.2772653-47385-197515989121532=/root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204523.34031: variable 'ansible_module_compression' from source: unknown 44842 1727204523.34034: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44842 1727204523.34036: variable 'ansible_facts' from source: unknown 44842 1727204523.34039: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/AnsiballZ_service_facts.py 44842 1727204523.34732: Sending initial data 44842 1727204523.34736: Sent initial data (162 bytes) 44842 1727204523.36548: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204523.37190: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 44842 1727204523.37209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.37228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.37287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.37300: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204523.37316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.37334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204523.37346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204523.37358: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204523.37377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204523.37391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.37406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.37418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.37429: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204523.37442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.37524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204523.37549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204523.37572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204523.37668: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44842 1727204523.39409: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204523.39456: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204523.39510: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmps_na_mit /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/AnsiballZ_service_facts.py <<< 44842 1727204523.39558: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204523.41087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204523.41214: stderr chunk (state=3): >>><<< 44842 1727204523.41218: stdout chunk (state=3): >>><<< 44842 1727204523.41220: done transferring module to remote 44842 1727204523.41222: _low_level_execute_command(): starting 44842 1727204523.41224: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/ /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/AnsiballZ_service_facts.py && sleep 0' 44842 1727204523.42651: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204523.42851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 44842 1727204523.42876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.42900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.42945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.42958: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204523.42982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.43000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204523.43017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204523.43028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204523.43040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204523.43053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.43074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.43086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.43097: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204523.43109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.43189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204523.43213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204523.43232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204523.43322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204523.45105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204523.45109: stderr chunk (state=3): >>><<< 44842 1727204523.45111: stdout chunk (state=3): >>><<< 44842 1727204523.45131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204523.45136: _low_level_execute_command(): starting 44842 1727204523.45138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/AnsiballZ_service_facts.py && sleep 0' 44842 1727204523.46371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204523.46388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204523.46406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
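By this point the log has shown the full low-level remote-execution sequence for one module: discover `$HOME` (`echo ~`), create a per-task tmp directory under `~/.ansible/tmp`, transfer `AnsiballZ_service_facts.py` over sftp, `chmod u+x` it, and finally run it with the remote Python. A sketch of that command sequence, with the tmp-dir name composed as `ansible-tmp-<epoch>-<pid>-<random>` (an assumption inferred from the names in the log, not a documented contract):

```python
# Reconstruction of the _low_level_execute_command() sequence in the log.
import os
import random
import time

# Assumed naming scheme, matching e.g. ansible-tmp-1727204523.2772653-47385-...
tmp = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
remote_tmp = f"/root/.ansible/tmp/{tmp}"
module = f"{remote_tmp}/AnsiballZ_service_facts.py"

commands = [
    "/bin/sh -c 'echo ~ && sleep 0'",  # find the remote home directory
    f"/bin/sh -c '( umask 77 && mkdir -p /root/.ansible/tmp && mkdir {remote_tmp} ) && sleep 0'",
    # (module payload transferred here via sftp "put ... AnsiballZ_service_facts.py")
    f"/bin/sh -c 'chmod u+x {remote_tmp}/ {module} && sleep 0'",
    f"/bin/sh -c '/usr/bin/python3.9 {module} && sleep 0'",  # run the module
]
for c in commands:
    print(c)
```

Each step is wrapped in `/bin/sh -c '... && sleep 0'`, which is why every entry in the log above appears as a separate `_low_level_execute_command()` round trip over the multiplexed ssh connection.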
44842 1727204523.46432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.46482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.46494: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204523.46508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.46529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204523.46544: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204523.46555: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204523.46573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204523.46587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204523.46602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204523.46614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204523.46628: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204523.46642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204523.46721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204523.46748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204523.46775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204523.46881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204524.75481: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": 
"auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state<<< 44842 1727204524.75540: stdout chunk (state=3): >>>": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44842 1727204524.76832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204524.76948: stderr chunk (state=3): >>><<< 44842 1727204524.76951: stdout chunk (state=3): >>><<< 44842 1727204524.77273: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": 
{"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204524.77750: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204524.77772: _low_level_execute_command(): starting 44842 1727204524.77783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204523.2772653-47385-197515989121532/ > /dev/null 2>&1 && sleep 0' 44842 1727204524.79707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204524.79722: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 44842 1727204524.79736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204524.79755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204524.79807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204524.79820: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204524.79835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204524.79854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204524.79873: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204524.79993: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204524.80009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204524.80026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204524.80046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204524.80062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204524.80077: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204524.80089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204524.80168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204524.80189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204524.80202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204524.80291: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44842 1727204524.82188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204524.82192: stdout chunk (state=3): >>><<< 44842 1727204524.82198: stderr chunk (state=3): >>><<< 44842 1727204524.82223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204524.82229: handler run complete 44842 1727204524.82415: variable 'ansible_facts' from source: unknown 44842 1727204524.82551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204524.83213: variable 'ansible_facts' from source: unknown 44842 1727204524.83334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204524.83527: attempt loop complete, returning result 44842 
1727204524.83533: _execute() done 44842 1727204524.83535: dumping result to json 44842 1727204524.83629: done dumping result, returning 44842 1727204524.83632: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-aad0-d242-0000000005e3] 44842 1727204524.83635: sending task result for task 0affcd87-79f5-aad0-d242-0000000005e3 44842 1727204524.85086: done sending task result for task 0affcd87-79f5-aad0-d242-0000000005e3 44842 1727204524.85090: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204524.85165: no more pending results, returning what we have 44842 1727204524.85168: results queue empty 44842 1727204524.85169: checking for any_errors_fatal 44842 1727204524.85173: done checking for any_errors_fatal 44842 1727204524.85174: checking for max_fail_percentage 44842 1727204524.85175: done checking for max_fail_percentage 44842 1727204524.85176: checking to see if all hosts have failed and the running result is not ok 44842 1727204524.85177: done checking to see if all hosts have failed 44842 1727204524.85178: getting the remaining hosts for this loop 44842 1727204524.85179: done getting the remaining hosts for this loop 44842 1727204524.85183: getting the next task for host managed-node1 44842 1727204524.85188: done getting next task for host managed-node1 44842 1727204524.85192: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44842 1727204524.85195: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204524.85204: getting variables 44842 1727204524.85206: in VariableManager get_vars() 44842 1727204524.85239: Calling all_inventory to load vars for managed-node1 44842 1727204524.85242: Calling groups_inventory to load vars for managed-node1 44842 1727204524.85245: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204524.85254: Calling all_plugins_play to load vars for managed-node1 44842 1727204524.85257: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204524.85263: Calling groups_plugins_play to load vars for managed-node1 44842 1727204524.87031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204524.89066: done with get_vars() 44842 1727204524.89109: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:02:04 -0400 (0:00:01.673) 0:00:35.060 ***** 44842 1727204524.89224: entering _queue_task() for managed-node1/package_facts 44842 1727204524.89605: worker is 1 (out of 1 available) 44842 1727204524.89616: exiting _queue_task() for managed-node1/package_facts 44842 1727204524.89627: done queuing things up, now waiting for results queue to drain 44842 1727204524.89628: waiting for pending results... 
44842 1727204524.89950: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 44842 1727204524.90099: in run() - task 0affcd87-79f5-aad0-d242-0000000005e4 44842 1727204524.90113: variable 'ansible_search_path' from source: unknown 44842 1727204524.90118: variable 'ansible_search_path' from source: unknown 44842 1727204524.90153: calling self._execute() 44842 1727204524.90271: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204524.90276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204524.90291: variable 'omit' from source: magic vars 44842 1727204524.90733: variable 'ansible_distribution_major_version' from source: facts 44842 1727204524.90753: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204524.90759: variable 'omit' from source: magic vars 44842 1727204524.90824: variable 'omit' from source: magic vars 44842 1727204524.90874: variable 'omit' from source: magic vars 44842 1727204524.90918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204524.90963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204524.90992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204524.91010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204524.91021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204524.91067: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204524.91073: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204524.91076: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 44842 1727204524.91188: Set connection var ansible_shell_type to sh 44842 1727204524.91199: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204524.91204: Set connection var ansible_connection to ssh 44842 1727204524.91210: Set connection var ansible_pipelining to False 44842 1727204524.91216: Set connection var ansible_timeout to 10 44842 1727204524.91224: Set connection var ansible_shell_executable to /bin/sh 44842 1727204524.91248: variable 'ansible_shell_executable' from source: unknown 44842 1727204524.91252: variable 'ansible_connection' from source: unknown 44842 1727204524.91254: variable 'ansible_module_compression' from source: unknown 44842 1727204524.91257: variable 'ansible_shell_type' from source: unknown 44842 1727204524.91259: variable 'ansible_shell_executable' from source: unknown 44842 1727204524.91261: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204524.91272: variable 'ansible_pipelining' from source: unknown 44842 1727204524.91276: variable 'ansible_timeout' from source: unknown 44842 1727204524.91284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204524.91536: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204524.91547: variable 'omit' from source: magic vars 44842 1727204524.91552: starting attempt loop 44842 1727204524.91557: running the handler 44842 1727204524.91575: _low_level_execute_command(): starting 44842 1727204524.91583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204524.92446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204524.92459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 44842 1727204524.92476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204524.92491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204524.92543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204524.92552: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204524.92559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204524.92579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204524.92587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204524.92599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204524.92602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204524.92619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204524.92636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204524.92644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204524.92652: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204524.92662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204524.92747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204524.92775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204524.92784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204524.92876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204524.94507: stdout chunk (state=3): >>>/root <<< 44842 1727204524.94684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204524.94710: stderr chunk (state=3): >>><<< 44842 1727204524.94713: stdout chunk (state=3): >>><<< 44842 1727204524.94738: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204524.94752: _low_level_execute_command(): starting 44842 1727204524.94758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268 `" && echo ansible-tmp-1727204524.9473805-47442-70119093057268="` echo /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268 `" ) && sleep 0' 44842 1727204524.95447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 44842 1727204524.95456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204524.95472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204524.95486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204524.95530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204524.95537: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204524.95546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204524.95559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204524.95572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204524.95579: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204524.95587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204524.95596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204524.95614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204524.95621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204524.95627: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204524.95636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204524.95714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204524.95733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204524.95743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 44842 1727204524.95828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204524.97683: stdout chunk (state=3): >>>ansible-tmp-1727204524.9473805-47442-70119093057268=/root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268 <<< 44842 1727204524.97891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204524.97895: stdout chunk (state=3): >>><<< 44842 1727204524.97897: stderr chunk (state=3): >>><<< 44842 1727204524.97972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204524.9473805-47442-70119093057268=/root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204524.97976: variable 'ansible_module_compression' from source: unknown 44842 1727204524.98272: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44842 1727204524.98276: variable 'ansible_facts' from source: unknown 44842 1727204524.98284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/AnsiballZ_package_facts.py 44842 1727204524.98845: Sending initial data 44842 1727204524.98849: Sent initial data (161 bytes) 44842 1727204525.00392: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204525.00406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.00420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204525.00437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.00490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204525.00503: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204525.00516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.00532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204525.00543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204525.00553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204525.00569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.00584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204525.00604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.00615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
<<< 44842 1727204525.00632: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204525.00651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.00733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204525.00756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204525.00776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204525.00861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204525.02576: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204525.02624: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204525.02683: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp904jy9h8 /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/AnsiballZ_package_facts.py <<< 44842 1727204525.02736: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204525.05340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204525.05656: stderr chunk (state=3): >>><<< 44842 1727204525.05675: stdout chunk (state=3): >>><<< 44842 1727204525.05682: done 
transferring module to remote 44842 1727204525.05691: _low_level_execute_command(): starting 44842 1727204525.05695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/ /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/AnsiballZ_package_facts.py && sleep 0' 44842 1727204525.07497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204525.07511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.07525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204525.07543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.07593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204525.07605: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204525.07619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.07643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204525.07657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204525.07678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204525.07692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.07705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204525.07719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.07730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204525.07744: stderr chunk (state=3): >>>debug2: 
match found <<< 44842 1727204525.07757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.07846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204525.07870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204525.07884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204525.08145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204525.09981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204525.10031: stderr chunk (state=3): >>><<< 44842 1727204525.10035: stdout chunk (state=3): >>><<< 44842 1727204525.10129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204525.10133: 
_low_level_execute_command(): starting 44842 1727204525.10136: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/AnsiballZ_package_facts.py && sleep 0' 44842 1727204525.11034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.11038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.11071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.11086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.11154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204525.11158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204525.11239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204525.57704: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": 
"20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 44842 
1727204525.57813: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 44842 1727204525.57923: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", 
"version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": 
"oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": 
"3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", 
"version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9",
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44842 1727204525.59391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204525.59455: stderr chunk (state=3): >>><<< 44842 1727204525.59458: stdout chunk (state=3): >>><<< 44842 1727204525.59532: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
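For context, the `package_facts` run being traced above (invoked with `"manager": ["auto"], "strategy": "first"`, and with `'_ansible_no_log': True`, which is why the result is censored further down) corresponds roughly to a task of the following shape. This is an illustrative sketch, not the actual task source from the `fedora.linux_system_roles.network` role:

```yaml
# Hypothetical sketch of the task being traced; the real task lives in the
# fedora.linux_system_roles.network collection and may differ in detail.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # matches "manager": ["auto"] in the module invocation above
    strategy: first    # matches "strategy": "first"
  no_log: true         # produces the "censored" result shown later in the log
```

With `no_log: true`, the controller still collects `ansible_facts.packages` (the large JSON blob above) but replaces the task result shown to the user with the "output has been hidden" notice.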
44842 1727204525.65976: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204525.66007: _low_level_execute_command(): starting 44842 1727204525.66017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204524.9473805-47442-70119093057268/ > /dev/null 2>&1 && sleep 0' 44842 1727204525.66552: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204525.66573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.66592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204525.66616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.66679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204525.66695: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204525.66712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.66729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204525.66738: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 44842 1727204525.66744: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204525.66769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204525.66787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204525.66804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204525.66815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204525.66875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204525.66883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204525.66957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204525.68788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204525.68856: stderr chunk (state=3): >>><<< 44842 1727204525.68860: stdout chunk (state=3): >>><<< 44842 1727204525.68888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204525.68894: handler run complete 44842 1727204525.69486: variable 'ansible_facts' from source: unknown 44842 1727204525.69899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204525.71212: variable 'ansible_facts' from source: unknown 44842 1727204525.71474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204525.72096: attempt loop complete, returning result 44842 1727204525.72112: _execute() done 44842 1727204525.72115: dumping result to json 44842 1727204525.72266: done dumping result, returning 44842 1727204525.72272: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-aad0-d242-0000000005e4] 44842 1727204525.72277: sending task result for task 0affcd87-79f5-aad0-d242-0000000005e4 44842 1727204525.80435: done sending task result for task 0affcd87-79f5-aad0-d242-0000000005e4 44842 1727204525.80440: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204525.80648: no more pending results, returning what we have 44842 1727204525.80651: results queue empty 44842 1727204525.80652: checking for any_errors_fatal 44842 1727204525.80656: done checking for any_errors_fatal 44842 1727204525.80656: checking for max_fail_percentage 44842 1727204525.80657: done checking for 
max_fail_percentage 44842 1727204525.80658: checking to see if all hosts have failed and the running result is not ok 44842 1727204525.80659: done checking to see if all hosts have failed 44842 1727204525.80660: getting the remaining hosts for this loop 44842 1727204525.80661: done getting the remaining hosts for this loop 44842 1727204525.80665: getting the next task for host managed-node1 44842 1727204525.80670: done getting next task for host managed-node1 44842 1727204525.80672: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44842 1727204525.80674: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204525.80682: getting variables 44842 1727204525.80683: in VariableManager get_vars() 44842 1727204525.80704: Calling all_inventory to load vars for managed-node1 44842 1727204525.80707: Calling groups_inventory to load vars for managed-node1 44842 1727204525.80709: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204525.80715: Calling all_plugins_play to load vars for managed-node1 44842 1727204525.80717: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204525.80720: Calling groups_plugins_play to load vars for managed-node1 44842 1727204525.82594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204525.85689: done with get_vars() 44842 1727204525.85720: done getting variables 44842 1727204525.85782: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.965) 0:00:36.026 ***** 44842 1727204525.85815: entering _queue_task() for managed-node1/debug 44842 1727204525.86397: worker is 1 (out of 1 available) 44842 1727204525.86413: exiting _queue_task() for managed-node1/debug 44842 1727204525.86425: done queuing things up, now waiting for results queue to drain 44842 1727204525.86426: waiting for pending results... 44842 1727204525.86934: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 44842 1727204525.87067: in run() - task 0affcd87-79f5-aad0-d242-000000000092 44842 1727204525.87081: variable 'ansible_search_path' from source: unknown 44842 1727204525.87085: variable 'ansible_search_path' from source: unknown 44842 1727204525.87286: calling self._execute() 44842 1727204525.87605: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204525.87609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204525.87619: variable 'omit' from source: magic vars 44842 1727204525.88036: variable 'ansible_distribution_major_version' from source: facts 44842 1727204525.88050: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204525.88056: variable 'omit' from source: magic vars 44842 1727204525.88100: variable 'omit' from source: magic vars 44842 1727204525.88215: variable 'network_provider' from source: set_fact 44842 1727204525.88268: variable 'omit' from source: magic vars 44842 1727204525.88311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204525.88346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 44842 1727204525.88422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204525.88439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204525.88486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204525.88515: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204525.88526: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204525.88576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204525.88890: Set connection var ansible_shell_type to sh 44842 1727204525.88906: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204525.88911: Set connection var ansible_connection to ssh 44842 1727204525.88918: Set connection var ansible_pipelining to False 44842 1727204525.88949: Set connection var ansible_timeout to 10 44842 1727204525.88956: Set connection var ansible_shell_executable to /bin/sh 44842 1727204525.88986: variable 'ansible_shell_executable' from source: unknown 44842 1727204525.88990: variable 'ansible_connection' from source: unknown 44842 1727204525.88992: variable 'ansible_module_compression' from source: unknown 44842 1727204525.88995: variable 'ansible_shell_type' from source: unknown 44842 1727204525.88997: variable 'ansible_shell_executable' from source: unknown 44842 1727204525.88999: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204525.89001: variable 'ansible_pipelining' from source: unknown 44842 1727204525.89011: variable 'ansible_timeout' from source: unknown 44842 1727204525.89015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204525.89394: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204525.89431: variable 'omit' from source: magic vars 44842 1727204525.89437: starting attempt loop 44842 1727204525.89440: running the handler 44842 1727204525.89527: handler run complete 44842 1727204525.89543: attempt loop complete, returning result 44842 1727204525.89546: _execute() done 44842 1727204525.89549: dumping result to json 44842 1727204525.89551: done dumping result, returning 44842 1727204525.89567: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-aad0-d242-000000000092] 44842 1727204525.89577: sending task result for task 0affcd87-79f5-aad0-d242-000000000092 44842 1727204525.89718: done sending task result for task 0affcd87-79f5-aad0-d242-000000000092 44842 1727204525.89721: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 44842 1727204525.89848: no more pending results, returning what we have 44842 1727204525.89852: results queue empty 44842 1727204525.89852: checking for any_errors_fatal 44842 1727204525.89868: done checking for any_errors_fatal 44842 1727204525.89869: checking for max_fail_percentage 44842 1727204525.89870: done checking for max_fail_percentage 44842 1727204525.89871: checking to see if all hosts have failed and the running result is not ok 44842 1727204525.89872: done checking to see if all hosts have failed 44842 1727204525.89873: getting the remaining hosts for this loop 44842 1727204525.89875: done getting the remaining hosts for this loop 44842 1727204525.89879: getting the next task for host managed-node1 44842 1727204525.89885: done getting next task for host managed-node1 44842 1727204525.89889: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44842 1727204525.89890: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204525.89899: getting variables 44842 1727204525.89901: in VariableManager get_vars() 44842 1727204525.89941: Calling all_inventory to load vars for managed-node1 44842 1727204525.89944: Calling groups_inventory to load vars for managed-node1 44842 1727204525.89947: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204525.89960: Calling all_plugins_play to load vars for managed-node1 44842 1727204525.89962: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204525.89968: Calling groups_plugins_play to load vars for managed-node1 44842 1727204525.94836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204525.99030: done with get_vars() 44842 1727204525.99077: done getting variables 44842 1727204525.99149: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:02:05 -0400 (0:00:00.133) 0:00:36.159 ***** 44842 1727204525.99199: entering _queue_task() 
for managed-node1/fail 44842 1727204525.99549: worker is 1 (out of 1 available) 44842 1727204525.99566: exiting _queue_task() for managed-node1/fail 44842 1727204525.99584: done queuing things up, now waiting for results queue to drain 44842 1727204525.99589: waiting for pending results... 44842 1727204525.99950: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44842 1727204526.00086: in run() - task 0affcd87-79f5-aad0-d242-000000000093 44842 1727204526.00109: variable 'ansible_search_path' from source: unknown 44842 1727204526.00112: variable 'ansible_search_path' from source: unknown 44842 1727204526.00168: calling self._execute() 44842 1727204526.00282: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.00286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.00297: variable 'omit' from source: magic vars 44842 1727204526.01169: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.01173: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.01177: variable 'network_state' from source: role '' defaults 44842 1727204526.01179: Evaluated conditional (network_state != {}): False 44842 1727204526.01181: when evaluation is False, skipping this task 44842 1727204526.01184: _execute() done 44842 1727204526.01186: dumping result to json 44842 1727204526.01189: done dumping result, returning 44842 1727204526.01191: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-aad0-d242-000000000093] 44842 1727204526.01192: sending task result for task 0affcd87-79f5-aad0-d242-000000000093 44842 1727204526.01266: done sending task 
result for task 0affcd87-79f5-aad0-d242-000000000093 44842 1727204526.01272: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204526.01322: no more pending results, returning what we have 44842 1727204526.01326: results queue empty 44842 1727204526.01327: checking for any_errors_fatal 44842 1727204526.01335: done checking for any_errors_fatal 44842 1727204526.01335: checking for max_fail_percentage 44842 1727204526.01338: done checking for max_fail_percentage 44842 1727204526.01339: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.01341: done checking to see if all hosts have failed 44842 1727204526.01342: getting the remaining hosts for this loop 44842 1727204526.01344: done getting the remaining hosts for this loop 44842 1727204526.01348: getting the next task for host managed-node1 44842 1727204526.01356: done getting next task for host managed-node1 44842 1727204526.01360: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44842 1727204526.01363: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.01381: getting variables 44842 1727204526.01383: in VariableManager get_vars() 44842 1727204526.01425: Calling all_inventory to load vars for managed-node1 44842 1727204526.01428: Calling groups_inventory to load vars for managed-node1 44842 1727204526.01430: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.01442: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.01445: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.01448: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.05440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.09506: done with get_vars() 44842 1727204526.09543: done getting variables 44842 1727204526.09617: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.104) 0:00:36.264 ***** 44842 1727204526.09652: entering _queue_task() for managed-node1/fail 44842 1727204526.10050: worker is 1 (out of 1 available) 44842 1727204526.10071: exiting _queue_task() for managed-node1/fail 44842 1727204526.10083: done queuing things up, now waiting for results queue to drain 44842 1727204526.10084: waiting for pending results... 
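The two "Abort applying the network state configuration" tasks traced above are both skipped on the same condition, `network_state != {}`, with `network_state` coming from the role's defaults. A minimal sketch of what such a guard task looks like — the real tasks live at `roles/network/tasks/main.yml:11` and `:18` in the collection, and the `msg` wording here is hypothetical; only the task names and the `when:` condition are taken from the log:

```yaml
# Hypothetical reconstruction of the guard pattern the log shows being skipped.
# `network_state` defaults to {} ("from source: role '' defaults"), so the
# conditional evaluates False and the fail module never executes.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The initscripts provider does not support `network_state`.  # hypothetical wording
  when: network_state != {}
```

With the default empty `network_state`, the executor records exactly the result seen above: `skipping: [managed-node1]` with `"false_condition": "network_state != {}"`.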
44842 1727204526.10413: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44842 1727204526.10671: in run() - task 0affcd87-79f5-aad0-d242-000000000094 44842 1727204526.10709: variable 'ansible_search_path' from source: unknown 44842 1727204526.10713: variable 'ansible_search_path' from source: unknown 44842 1727204526.10796: calling self._execute() 44842 1727204526.11014: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.11018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.11087: variable 'omit' from source: magic vars 44842 1727204526.12057: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.12185: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.12510: variable 'network_state' from source: role '' defaults 44842 1727204526.12526: Evaluated conditional (network_state != {}): False 44842 1727204526.12530: when evaluation is False, skipping this task 44842 1727204526.12538: _execute() done 44842 1727204526.12541: dumping result to json 44842 1727204526.12551: done dumping result, returning 44842 1727204526.12562: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-aad0-d242-000000000094] 44842 1727204526.12567: sending task result for task 0affcd87-79f5-aad0-d242-000000000094 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204526.12719: no more pending results, returning what we have 44842 1727204526.12723: results queue empty 44842 1727204526.12724: checking for any_errors_fatal 44842 1727204526.12734: done checking for any_errors_fatal 
44842 1727204526.12734: checking for max_fail_percentage 44842 1727204526.12736: done checking for max_fail_percentage 44842 1727204526.12737: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.12738: done checking to see if all hosts have failed 44842 1727204526.12739: getting the remaining hosts for this loop 44842 1727204526.12741: done getting the remaining hosts for this loop 44842 1727204526.12745: getting the next task for host managed-node1 44842 1727204526.12752: done getting next task for host managed-node1 44842 1727204526.12755: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44842 1727204526.12760: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.12780: getting variables 44842 1727204526.12783: in VariableManager get_vars() 44842 1727204526.12825: Calling all_inventory to load vars for managed-node1 44842 1727204526.12828: Calling groups_inventory to load vars for managed-node1 44842 1727204526.12831: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.12846: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.12849: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.12852: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.13369: done sending task result for task 0affcd87-79f5-aad0-d242-000000000094 44842 1727204526.13373: WORKER PROCESS EXITING 44842 1727204526.15401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.18067: done with get_vars() 44842 1727204526.18131: done getting variables 44842 1727204526.19043: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.094) 0:00:36.358 ***** 44842 1727204526.19082: entering _queue_task() for managed-node1/fail 44842 1727204526.19889: worker is 1 (out of 1 available) 44842 1727204526.19903: exiting _queue_task() for managed-node1/fail 44842 1727204526.19915: done queuing things up, now waiting for results queue to drain 44842 1727204526.19917: waiting for pending results... 
44842 1727204526.20831: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44842 1727204526.21051: in run() - task 0affcd87-79f5-aad0-d242-000000000095 44842 1727204526.21069: variable 'ansible_search_path' from source: unknown 44842 1727204526.21073: variable 'ansible_search_path' from source: unknown 44842 1727204526.21226: calling self._execute() 44842 1727204526.21446: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.21450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.21466: variable 'omit' from source: magic vars 44842 1727204526.22331: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.22345: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.22787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204526.27908: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204526.28103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204526.28142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204526.28325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204526.28353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204526.28578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.28667: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.28697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.28873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.28939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.29123: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.29259: Evaluated conditional (ansible_distribution_major_version | int > 9): False 44842 1727204526.29267: when evaluation is False, skipping this task 44842 1727204526.29271: _execute() done 44842 1727204526.29273: dumping result to json 44842 1727204526.29276: done dumping result, returning 44842 1727204526.29285: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-aad0-d242-000000000095] 44842 1727204526.29291: sending task result for task 0affcd87-79f5-aad0-d242-000000000095 44842 1727204526.29396: done sending task result for task 0affcd87-79f5-aad0-d242-000000000095 44842 1727204526.29400: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 44842 1727204526.29451: no more pending results, returning what we have 44842 1727204526.29458: 
results queue empty 44842 1727204526.29459: checking for any_errors_fatal 44842 1727204526.29468: done checking for any_errors_fatal 44842 1727204526.29469: checking for max_fail_percentage 44842 1727204526.29471: done checking for max_fail_percentage 44842 1727204526.29472: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.29473: done checking to see if all hosts have failed 44842 1727204526.29474: getting the remaining hosts for this loop 44842 1727204526.29476: done getting the remaining hosts for this loop 44842 1727204526.29480: getting the next task for host managed-node1 44842 1727204526.29487: done getting next task for host managed-node1 44842 1727204526.29493: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44842 1727204526.29495: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.29511: getting variables 44842 1727204526.29513: in VariableManager get_vars() 44842 1727204526.29557: Calling all_inventory to load vars for managed-node1 44842 1727204526.29560: Calling groups_inventory to load vars for managed-node1 44842 1727204526.29563: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.29575: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.29578: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.29581: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.32790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.37081: done with get_vars() 44842 1727204526.37116: done getting variables 44842 1727204526.37305: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.182) 0:00:36.541 ***** 44842 1727204526.37339: entering _queue_task() for managed-node1/dnf 44842 1727204526.38024: worker is 1 (out of 1 available) 44842 1727204526.38151: exiting _queue_task() for managed-node1/dnf 44842 1727204526.38169: done queuing things up, now waiting for results queue to drain 44842 1727204526.38171: waiting for pending results... 
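The EL10 teaming guard just traced is skipped because `ansible_distribution_major_version | int > 9` evaluates to False on this host. A hedged sketch of that task (the real one is at `roles/network/tasks/main.yml:25`; the `msg` text is hypothetical, the name and condition come from the log):

```yaml
# Hypothetical sketch of the EL10 teaming guard. On this managed node the
# distribution major version is <= 9, so the `when:` is False and the task
# is skipped, as the log records.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later.  # hypothetical wording
  when: ansible_distribution_major_version | int > 9
```

Note the `| int` cast: the earlier conditional in the same trace, `ansible_distribution_major_version != '6'`, compares strings, while this one compares numerically.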
44842 1727204526.39239: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44842 1727204526.39464: in run() - task 0affcd87-79f5-aad0-d242-000000000096 44842 1727204526.39592: variable 'ansible_search_path' from source: unknown 44842 1727204526.39597: variable 'ansible_search_path' from source: unknown 44842 1727204526.39643: calling self._execute() 44842 1727204526.39874: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.39878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.39890: variable 'omit' from source: magic vars 44842 1727204526.40368: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.40382: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.40692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204526.45828: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204526.45908: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204526.45944: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204526.45985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204526.46018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204526.46106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.46568: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.46596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.46641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.46671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.46804: variable 'ansible_distribution' from source: facts 44842 1727204526.46808: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.46825: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44842 1727204526.46963: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204526.48152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.48531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.48570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.48610: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.48647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.48751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.48873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.48876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.48942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.48969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.49026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.49053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 
1727204526.49086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.49133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.49152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.49349: variable 'network_connections' from source: play vars 44842 1727204526.49370: variable 'profile' from source: play vars 44842 1727204526.49453: variable 'profile' from source: play vars 44842 1727204526.49465: variable 'interface' from source: set_fact 44842 1727204526.49535: variable 'interface' from source: set_fact 44842 1727204526.49609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204526.49796: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204526.49840: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204526.49883: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204526.49918: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204526.49970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204526.50000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204526.50040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.50078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204526.50131: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204526.50341: variable 'network_connections' from source: play vars 44842 1727204526.50344: variable 'profile' from source: play vars 44842 1727204526.50388: variable 'profile' from source: play vars 44842 1727204526.50393: variable 'interface' from source: set_fact 44842 1727204526.50434: variable 'interface' from source: set_fact 44842 1727204526.50456: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204526.50459: when evaluation is False, skipping this task 44842 1727204526.50464: _execute() done 44842 1727204526.50468: dumping result to json 44842 1727204526.50470: done dumping result, returning 44842 1727204526.50472: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000096] 44842 1727204526.50481: sending task result for task 0affcd87-79f5-aad0-d242-000000000096 44842 1727204526.50682: done sending task result for task 0affcd87-79f5-aad0-d242-000000000096 44842 1727204526.50686: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 44842 1727204526.50743: no more pending results, returning what we have 44842 1727204526.50747: results queue empty 44842 1727204526.50748: checking for any_errors_fatal 44842 1727204526.50754: done checking for any_errors_fatal 44842 1727204526.50755: checking for max_fail_percentage 44842 1727204526.50757: done checking for max_fail_percentage 44842 1727204526.50757: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.50758: done checking to see if all hosts have failed 44842 1727204526.50759: getting the remaining hosts for this loop 44842 1727204526.50763: done getting the remaining hosts for this loop 44842 1727204526.50947: getting the next task for host managed-node1 44842 1727204526.50954: done getting next task for host managed-node1 44842 1727204526.50958: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44842 1727204526.50960: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.50974: getting variables 44842 1727204526.50976: in VariableManager get_vars() 44842 1727204526.51012: Calling all_inventory to load vars for managed-node1 44842 1727204526.51015: Calling groups_inventory to load vars for managed-node1 44842 1727204526.51017: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.51028: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.51030: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.51033: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.52767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.55253: done with get_vars() 44842 1727204526.55293: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44842 1727204526.55397: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.180) 0:00:36.722 ***** 44842 1727204526.55428: entering _queue_task() for managed-node1/yum 44842 1727204526.55794: worker is 1 (out of 1 available) 44842 1727204526.55806: exiting _queue_task() for managed-node1/yum 44842 1727204526.55819: done queuing things up, now waiting for results queue to drain 44842 1727204526.55821: waiting for pending results... 
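The trace above covers a pair of package-update probes: a DNF variant skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` holds, and a YUM variant skipped because the host's major version is 8 or later. A hedged sketch of the pair — the real tasks are at `roles/network/tasks/main.yml:36` and `:48`; the module arguments shown are assumptions, while the task names and `when:` conditions are taken directly from the log:

```yaml
# Hypothetical sketch of the paired package-update checks. Only one of the
# two can ever run: DNF on Fedora/EL8+, YUM on EL7 and earlier.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager  # hypothetical package list
    state: latest
  check_mode: true  # assumption: a check-only probe, matching "Check if updates ... are available"
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:  # redirected to ansible.builtin.dnf on this controller, per the log
    name: NetworkManager  # hypothetical package list
    state: latest
  check_mode: true
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - ansible_distribution_major_version | int < 8
```

On this run both are skipped: the DNF task on the wireless/team condition, the YUM task on `ansible_distribution_major_version | int < 8` — matching the two `skipping: [managed-node1]` results in the trace.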
44842 1727204526.56154: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44842 1727204526.56293: in run() - task 0affcd87-79f5-aad0-d242-000000000097 44842 1727204526.56325: variable 'ansible_search_path' from source: unknown 44842 1727204526.56339: variable 'ansible_search_path' from source: unknown 44842 1727204526.56393: calling self._execute() 44842 1727204526.56616: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.56626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.56638: variable 'omit' from source: magic vars 44842 1727204526.57103: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.57130: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.57352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204526.59353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204526.59408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204526.59438: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204526.59466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204526.59488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204526.59547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.59582: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.59602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.59635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.59644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.59720: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.59736: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44842 1727204526.59740: when evaluation is False, skipping this task 44842 1727204526.59744: _execute() done 44842 1727204526.59746: dumping result to json 44842 1727204526.59749: done dumping result, returning 44842 1727204526.59754: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000097] 44842 1727204526.59759: sending task result for task 0affcd87-79f5-aad0-d242-000000000097 44842 1727204526.59851: done sending task result for task 0affcd87-79f5-aad0-d242-000000000097 44842 1727204526.59854: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44842 1727204526.59908: no more pending results, returning 
what we have 44842 1727204526.59912: results queue empty 44842 1727204526.59913: checking for any_errors_fatal 44842 1727204526.59919: done checking for any_errors_fatal 44842 1727204526.59920: checking for max_fail_percentage 44842 1727204526.59921: done checking for max_fail_percentage 44842 1727204526.59922: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.59923: done checking to see if all hosts have failed 44842 1727204526.59924: getting the remaining hosts for this loop 44842 1727204526.59926: done getting the remaining hosts for this loop 44842 1727204526.59930: getting the next task for host managed-node1 44842 1727204526.59935: done getting next task for host managed-node1 44842 1727204526.59939: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44842 1727204526.59941: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.59954: getting variables 44842 1727204526.59956: in VariableManager get_vars() 44842 1727204526.59996: Calling all_inventory to load vars for managed-node1 44842 1727204526.59999: Calling groups_inventory to load vars for managed-node1 44842 1727204526.60001: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.60010: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.60012: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.60014: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.60908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.62542: done with get_vars() 44842 1727204526.62578: done getting variables 44842 1727204526.62666: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.072) 0:00:36.795 ***** 44842 1727204526.62696: entering _queue_task() for managed-node1/fail 44842 1727204526.63023: worker is 1 (out of 1 available) 44842 1727204526.63038: exiting _queue_task() for managed-node1/fail 44842 1727204526.63050: done queuing things up, now waiting for results queue to drain 44842 1727204526.63051: waiting for pending results... 
44842 1727204526.63266: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44842 1727204526.63349: in run() - task 0affcd87-79f5-aad0-d242-000000000098 44842 1727204526.63422: variable 'ansible_search_path' from source: unknown 44842 1727204526.63427: variable 'ansible_search_path' from source: unknown 44842 1727204526.63431: calling self._execute() 44842 1727204526.63567: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.63571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.63575: variable 'omit' from source: magic vars 44842 1727204526.63984: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.63995: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.64117: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204526.64353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204526.67015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204526.67069: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204526.67113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204526.67151: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204526.67180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204526.67272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44842 1727204526.67290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.67327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.67373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.67401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.67440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.67472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.67495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.67520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.67531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.67563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.67582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.67600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.67623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.67633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.67777: variable 'network_connections' from source: play vars 44842 1727204526.67792: variable 'profile' from source: play vars 44842 1727204526.67883: variable 'profile' from source: play vars 44842 1727204526.67886: variable 'interface' from source: set_fact 44842 1727204526.67954: variable 'interface' from source: set_fact 44842 1727204526.68010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204526.68157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204526.68187: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204526.68209: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204526.68235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204526.68287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204526.68303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204526.68320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.68339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204526.68380: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204526.68554: variable 'network_connections' from source: play vars 44842 1727204526.68559: variable 'profile' from source: play vars 44842 1727204526.68607: variable 'profile' from source: play vars 44842 1727204526.68610: variable 'interface' from source: set_fact 44842 1727204526.68676: variable 'interface' from source: set_fact 44842 1727204526.68698: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44842 1727204526.68701: when evaluation is False, skipping this task 44842 1727204526.68704: _execute() done 44842 1727204526.68706: dumping result to json 44842 1727204526.68709: done dumping result, returning 44842 1727204526.68723: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-000000000098] 44842 1727204526.68733: sending task result for task 0affcd87-79f5-aad0-d242-000000000098 44842 1727204526.68845: done sending task result for task 0affcd87-79f5-aad0-d242-000000000098 44842 1727204526.68848: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44842 1727204526.68902: no more pending results, returning what we have 44842 1727204526.68905: results queue empty 44842 1727204526.68906: checking for any_errors_fatal 44842 1727204526.68912: done checking for any_errors_fatal 44842 1727204526.68913: checking for max_fail_percentage 44842 1727204526.68915: done checking for max_fail_percentage 44842 1727204526.68916: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.68917: done checking to see if all hosts have failed 44842 1727204526.68917: getting the remaining hosts for this loop 44842 1727204526.68919: done getting the remaining hosts for this loop 44842 1727204526.68923: getting the next task for host managed-node1 44842 1727204526.68929: done getting next task for host managed-node1 44842 1727204526.68933: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44842 1727204526.68934: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.68948: getting variables 44842 1727204526.68949: in VariableManager get_vars() 44842 1727204526.68999: Calling all_inventory to load vars for managed-node1 44842 1727204526.69002: Calling groups_inventory to load vars for managed-node1 44842 1727204526.69004: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.69012: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.69014: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.69017: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.70033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.71017: done with get_vars() 44842 1727204526.71035: done getting variables 44842 1727204526.71090: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.084) 0:00:36.879 ***** 44842 1727204526.71114: entering _queue_task() for managed-node1/package 44842 1727204526.71358: worker is 1 (out of 1 available) 44842 1727204526.71378: exiting _queue_task() for managed-node1/package 44842 1727204526.71390: done queuing things up, now waiting for results queue to drain 44842 1727204526.71392: waiting for pending results... 
44842 1727204526.71573: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 44842 1727204526.71647: in run() - task 0affcd87-79f5-aad0-d242-000000000099 44842 1727204526.71658: variable 'ansible_search_path' from source: unknown 44842 1727204526.71660: variable 'ansible_search_path' from source: unknown 44842 1727204526.71695: calling self._execute() 44842 1727204526.71775: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204526.71778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204526.71787: variable 'omit' from source: magic vars 44842 1727204526.72078: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.72089: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204526.72232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204526.72432: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204526.72476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204526.72510: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204526.72643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204526.72815: variable 'network_packages' from source: role '' defaults 44842 1727204526.72923: variable '__network_provider_setup' from source: role '' defaults 44842 1727204526.72956: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204526.73035: variable '__network_service_name_default_nm' from source: role '' defaults 44842 1727204526.73040: variable '__network_packages_default_nm' from source: role '' defaults 44842 1727204526.73121: variable 
'__network_packages_default_nm' from source: role '' defaults 44842 1727204526.73343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204526.76263: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204526.76347: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204526.76380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204526.76405: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204526.76428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204526.76493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.76512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.76534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.76562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.76582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 
1727204526.76661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.76719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.76761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.76820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.76830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.77015: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44842 1727204526.77180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.77204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.77222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.77249: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.77259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.77344: variable 'ansible_python' from source: facts 44842 1727204526.77367: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44842 1727204526.77433: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204526.77508: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204526.77649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.77684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.77723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.77827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.78172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.78210: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204526.78233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204526.78287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.78305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204526.78320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204526.78468: variable 'network_connections' from source: play vars 44842 1727204526.78474: variable 'profile' from source: play vars 44842 1727204526.78584: variable 'profile' from source: play vars 44842 1727204526.78591: variable 'interface' from source: set_fact 44842 1727204526.78668: variable 'interface' from source: set_fact 44842 1727204526.78737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204526.78767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204526.78794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204526.78825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204526.78874: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204526.79206: variable 'network_connections' from source: play vars 44842 1727204526.79209: variable 'profile' from source: play vars 44842 1727204526.79356: variable 'profile' from source: play vars 44842 1727204526.79360: variable 'interface' from source: set_fact 44842 1727204526.79425: variable 'interface' from source: set_fact 44842 1727204526.79474: variable '__network_packages_default_wireless' from source: role '' defaults 44842 1727204526.79567: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204526.79905: variable 'network_connections' from source: play vars 44842 1727204526.79908: variable 'profile' from source: play vars 44842 1727204526.79984: variable 'profile' from source: play vars 44842 1727204526.79988: variable 'interface' from source: set_fact 44842 1727204526.80109: variable 'interface' from source: set_fact 44842 1727204526.80134: variable '__network_packages_default_team' from source: role '' defaults 44842 1727204526.80218: variable '__network_team_connections_defined' from source: role '' defaults 44842 1727204526.80553: variable 'network_connections' from source: play vars 44842 1727204526.80557: variable 'profile' from source: play vars 44842 1727204526.80628: variable 'profile' from source: play vars 44842 1727204526.80632: variable 'interface' from source: set_fact 44842 1727204526.80745: variable 'interface' from source: set_fact 44842 1727204526.80807: variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204526.80870: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 44842 1727204526.80877: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204526.80941: variable '__network_packages_default_initscripts' from source: role '' defaults 44842 1727204526.81173: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44842 1727204526.81728: variable 'network_connections' from source: play vars 44842 1727204526.81731: variable 'profile' from source: play vars 44842 1727204526.81799: variable 'profile' from source: play vars 44842 1727204526.81802: variable 'interface' from source: set_fact 44842 1727204526.81872: variable 'interface' from source: set_fact 44842 1727204526.81884: variable 'ansible_distribution' from source: facts 44842 1727204526.81887: variable '__network_rh_distros' from source: role '' defaults 44842 1727204526.81893: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.81906: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44842 1727204526.82107: variable 'ansible_distribution' from source: facts 44842 1727204526.82110: variable '__network_rh_distros' from source: role '' defaults 44842 1727204526.82113: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.82130: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44842 1727204526.82302: variable 'ansible_distribution' from source: facts 44842 1727204526.82306: variable '__network_rh_distros' from source: role '' defaults 44842 1727204526.82314: variable 'ansible_distribution_major_version' from source: facts 44842 1727204526.82349: variable 'network_provider' from source: set_fact 44842 1727204526.82367: variable 'ansible_facts' from source: unknown 44842 1727204526.83388: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44842 
1727204526.83392: when evaluation is False, skipping this task 44842 1727204526.83395: _execute() done 44842 1727204526.83423: dumping result to json 44842 1727204526.83428: done dumping result, returning 44842 1727204526.83434: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-aad0-d242-000000000099] 44842 1727204526.83437: sending task result for task 0affcd87-79f5-aad0-d242-000000000099 skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44842 1727204526.83632: no more pending results, returning what we have 44842 1727204526.83637: results queue empty 44842 1727204526.83638: checking for any_errors_fatal 44842 1727204526.83648: done checking for any_errors_fatal 44842 1727204526.83649: checking for max_fail_percentage 44842 1727204526.83652: done checking for max_fail_percentage 44842 1727204526.83653: checking to see if all hosts have failed and the running result is not ok 44842 1727204526.83657: done checking to see if all hosts have failed 44842 1727204526.83658: getting the remaining hosts for this loop 44842 1727204526.83662: done getting the remaining hosts for this loop 44842 1727204526.83669: getting the next task for host managed-node1 44842 1727204526.83676: done getting next task for host managed-node1 44842 1727204526.83681: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44842 1727204526.83684: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204526.83701: getting variables 44842 1727204526.83703: in VariableManager get_vars() 44842 1727204526.83747: Calling all_inventory to load vars for managed-node1 44842 1727204526.83750: Calling groups_inventory to load vars for managed-node1 44842 1727204526.83755: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204526.83773: Calling all_plugins_play to load vars for managed-node1 44842 1727204526.83782: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204526.83787: Calling groups_plugins_play to load vars for managed-node1 44842 1727204526.84306: done sending task result for task 0affcd87-79f5-aad0-d242-000000000099 44842 1727204526.84310: WORKER PROCESS EXITING 44842 1727204526.85634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204526.87768: done with get_vars() 44842 1727204526.87793: done getting variables 44842 1727204526.87888: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.168) 0:00:37.047 ***** 44842 1727204526.87932: entering _queue_task() for managed-node1/package 44842 1727204526.88388: worker is 1 (out of 1 available) 44842 1727204526.88407: exiting _queue_task() for managed-node1/package 44842 1727204526.88425: done queuing things up, now waiting for results queue to drain 44842 1727204526.88426: waiting for pending results... 
44842 1727204526.88765: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
44842 1727204526.88893: in run() - task 0affcd87-79f5-aad0-d242-00000000009a
44842 1727204526.88913: variable 'ansible_search_path' from source: unknown
44842 1727204526.88917: variable 'ansible_search_path' from source: unknown
44842 1727204526.88952: calling self._execute()
44842 1727204526.89079: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204526.89083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204526.89094: variable 'omit' from source: magic vars
44842 1727204526.89584: variable 'ansible_distribution_major_version' from source: facts
44842 1727204526.89596: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204526.89723: variable 'network_state' from source: role '' defaults
44842 1727204526.89734: Evaluated conditional (network_state != {}): False
44842 1727204526.89738: when evaluation is False, skipping this task
44842 1727204526.89741: _execute() done
44842 1727204526.89743: dumping result to json
44842 1727204526.89745: done dumping result, returning
44842 1727204526.89753: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-aad0-d242-00000000009a]
44842 1727204526.89758: sending task result for task 0affcd87-79f5-aad0-d242-00000000009a
44842 1727204526.89865: done sending task result for task 0affcd87-79f5-aad0-d242-00000000009a
44842 1727204526.89869: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44842 1727204526.89932: no more pending results, returning what we have
44842 1727204526.89936: results queue empty
44842 1727204526.89937: checking for any_errors_fatal
44842 1727204526.89943: done checking for any_errors_fatal
44842 1727204526.89944: checking for max_fail_percentage
44842 1727204526.89946: done checking for max_fail_percentage
44842 1727204526.89947: checking to see if all hosts have failed and the running result is not ok
44842 1727204526.89948: done checking to see if all hosts have failed
44842 1727204526.89948: getting the remaining hosts for this loop
44842 1727204526.89950: done getting the remaining hosts for this loop
44842 1727204526.89954: getting the next task for host managed-node1
44842 1727204526.89967: done getting next task for host managed-node1
44842 1727204526.89971: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44842 1727204526.89973: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204526.89989: getting variables
44842 1727204526.89991: in VariableManager get_vars()
44842 1727204526.90029: Calling all_inventory to load vars for managed-node1
44842 1727204526.90032: Calling groups_inventory to load vars for managed-node1
44842 1727204526.90035: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204526.90047: Calling all_plugins_play to load vars for managed-node1
44842 1727204526.90050: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204526.90053: Calling groups_plugins_play to load vars for managed-node1
44842 1727204526.92345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204526.94247: done with get_vars()
44842 1727204526.94281: done getting variables
44842 1727204526.94340: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 15:02:06 -0400 (0:00:00.064) 0:00:37.111 *****
44842 1727204526.94378: entering _queue_task() for managed-node1/package
44842 1727204526.94715: worker is 1 (out of 1 available)
44842 1727204526.94727: exiting _queue_task() for managed-node1/package
44842 1727204526.94738: done queuing things up, now waiting for results queue to drain
44842 1727204526.94740: waiting for pending results...
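Editor's note: both nmstate-related install tasks above are skipped on the same guard, `network_state != {}`, with `network_state` coming from the role defaults. A minimal Python sketch of that guard follows; the example desired-state value is hypothetical, not taken from this log.

```python
# Hypothetical illustration of the network_state guard from the skips above:
# with the role default of an empty dict, "network_state != {}" is False,
# so the NetworkManager/nmstate install tasks are skipped.
network_state = {}  # role '' default, as the log reports

install_nmstate = network_state != {}
print(install_nmstate)  # False -> both tasks are skipped

# If a play supplied a desired state, the guard would pass (illustrative value):
network_state = {"interfaces": [{"name": "eth0", "state": "up"}]}
print(network_state != {})  # True
```

The guard acts as a feature switch: nmstate packaging work happens only when the caller actually passes a declarative `network_state`.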
44842 1727204526.95031: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
44842 1727204526.95124: in run() - task 0affcd87-79f5-aad0-d242-00000000009b
44842 1727204526.95136: variable 'ansible_search_path' from source: unknown
44842 1727204526.95139: variable 'ansible_search_path' from source: unknown
44842 1727204526.95183: calling self._execute()
44842 1727204526.95908: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204526.95912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204526.95924: variable 'omit' from source: magic vars
44842 1727204526.96317: variable 'ansible_distribution_major_version' from source: facts
44842 1727204526.96329: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204526.96460: variable 'network_state' from source: role '' defaults
44842 1727204526.96476: Evaluated conditional (network_state != {}): False
44842 1727204526.96480: when evaluation is False, skipping this task
44842 1727204526.96482: _execute() done
44842 1727204526.96485: dumping result to json
44842 1727204526.96487: done dumping result, returning
44842 1727204526.96494: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-aad0-d242-00000000009b]
44842 1727204526.96500: sending task result for task 0affcd87-79f5-aad0-d242-00000000009b
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
44842 1727204526.96644: no more pending results, returning what we have
44842 1727204526.96648: results queue empty
44842 1727204526.96650: checking for any_errors_fatal
44842 1727204526.96657: done checking for any_errors_fatal
44842 1727204526.96658: checking for max_fail_percentage
44842 1727204526.96662: done checking for max_fail_percentage
44842 1727204526.96663: checking to see if all hosts have failed and the running result is not ok
44842 1727204526.96666: done checking to see if all hosts have failed
44842 1727204526.96666: getting the remaining hosts for this loop
44842 1727204526.96668: done getting the remaining hosts for this loop
44842 1727204526.96672: getting the next task for host managed-node1
44842 1727204526.96680: done getting next task for host managed-node1
44842 1727204526.96684: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44842 1727204526.96686: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204526.96705: getting variables
44842 1727204526.96707: in VariableManager get_vars()
44842 1727204526.96746: Calling all_inventory to load vars for managed-node1
44842 1727204526.96749: Calling groups_inventory to load vars for managed-node1
44842 1727204526.96751: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204526.96770: Calling all_plugins_play to load vars for managed-node1
44842 1727204526.96773: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204526.96777: Calling groups_plugins_play to load vars for managed-node1
44842 1727204526.97472: done sending task result for task 0affcd87-79f5-aad0-d242-00000000009b
44842 1727204526.97475: WORKER PROCESS EXITING
44842 1727204526.99449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204527.02881: done with get_vars()
44842 1727204527.02915: done getting variables
44842 1727204527.02979: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.086) 0:00:37.198 *****
44842 1727204527.03011: entering _queue_task() for managed-node1/service
44842 1727204527.03340: worker is 1 (out of 1 available)
44842 1727204527.03353: exiting _queue_task() for managed-node1/service
44842 1727204527.03368: done queuing things up, now waiting for results queue to drain
44842 1727204527.03370: waiting for pending results...
44842 1727204527.03651: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
44842 1727204527.03749: in run() - task 0affcd87-79f5-aad0-d242-00000000009c
44842 1727204527.03765: variable 'ansible_search_path' from source: unknown
44842 1727204527.03770: variable 'ansible_search_path' from source: unknown
44842 1727204527.03801: calling self._execute()
44842 1727204527.03894: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204527.03902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204527.03911: variable 'omit' from source: magic vars
44842 1727204527.04269: variable 'ansible_distribution_major_version' from source: facts
44842 1727204527.04281: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204527.04391: variable '__network_wireless_connections_defined' from source: role '' defaults
44842 1727204527.04584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44842 1727204527.08445: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44842 1727204527.08517: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44842 1727204527.08550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44842 1727204527.08586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44842 1727204527.08613: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44842 1727204527.08695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.08735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.08757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.08793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.08815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.08858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.08881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.08909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.08946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.08958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.08996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.09019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.09043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.09083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.09098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.09518: variable 'network_connections' from source: play vars
44842 1727204527.09531: variable 'profile' from source: play vars
44842 1727204527.09723: variable 'profile' from source: play vars
44842 1727204527.09726: variable 'interface' from source: set_fact
44842 1727204527.09906: variable 'interface' from source: set_fact
44842 1727204527.10013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44842 1727204527.10366: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44842 1727204527.10404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44842 1727204527.10549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44842 1727204527.10590: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44842 1727204527.10630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44842 1727204527.10767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44842 1727204527.10795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.10821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44842 1727204527.10987: variable '__network_team_connections_defined' from source: role '' defaults
44842 1727204527.11530: variable 'network_connections' from source: play vars
44842 1727204527.11536: variable 'profile' from source: play vars
44842 1727204527.11601: variable 'profile' from source: play vars
44842 1727204527.11604: variable 'interface' from source: set_fact
44842 1727204527.11780: variable 'interface' from source: set_fact
44842 1727204527.11813: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
44842 1727204527.11816: when evaluation is False, skipping this task
44842 1727204527.11818: _execute() done
44842 1727204527.11820: dumping result to json
44842 1727204527.11823: done dumping result, returning
44842 1727204527.11825: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-aad0-d242-00000000009c]
44842 1727204527.11835: sending task result for task 0affcd87-79f5-aad0-d242-00000000009c
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
44842 1727204527.12128: no more pending results, returning what we have
44842 1727204527.12132: results queue empty
44842 1727204527.12133: checking for any_errors_fatal
44842 1727204527.12140: done checking for any_errors_fatal
44842 1727204527.12141: checking for max_fail_percentage
44842 1727204527.12143: done checking for max_fail_percentage
44842 1727204527.12144: checking to see if all hosts have failed and the running result is not ok
44842 1727204527.12144: done checking to see if all hosts have failed
44842 1727204527.12145: getting the remaining hosts for this loop
44842 1727204527.12147: done getting the remaining hosts for this loop
44842 1727204527.12152: getting the next task for host managed-node1
44842 1727204527.12159: done getting next task for host managed-node1
44842 1727204527.12166: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44842 1727204527.12168: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44842 1727204527.12187: getting variables
44842 1727204527.12189: in VariableManager get_vars()
44842 1727204527.12232: Calling all_inventory to load vars for managed-node1
44842 1727204527.12235: Calling groups_inventory to load vars for managed-node1
44842 1727204527.12238: Calling all_plugins_inventory to load vars for managed-node1
44842 1727204527.12250: Calling all_plugins_play to load vars for managed-node1
44842 1727204527.12253: Calling groups_plugins_inventory to load vars for managed-node1
44842 1727204527.12257: Calling groups_plugins_play to load vars for managed-node1
44842 1727204527.12870: done sending task result for task 0affcd87-79f5-aad0-d242-00000000009c
44842 1727204527.12873: WORKER PROCESS EXITING
44842 1727204527.14832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44842 1727204527.18454: done with get_vars()
44842 1727204527.18495: done getting variables
44842 1727204527.18558: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.155) 0:00:37.354 *****
44842 1727204527.18592: entering _queue_task() for managed-node1/service
44842 1727204527.18927: worker is 1 (out of 1 available)
44842 1727204527.18939: exiting _queue_task() for managed-node1/service
44842 1727204527.18951: done queuing things up, now waiting for results queue to drain
44842 1727204527.18952: waiting for pending results...
44842 1727204527.19237: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
44842 1727204527.19333: in run() - task 0affcd87-79f5-aad0-d242-00000000009d
44842 1727204527.19347: variable 'ansible_search_path' from source: unknown
44842 1727204527.19352: variable 'ansible_search_path' from source: unknown
44842 1727204527.19390: calling self._execute()
44842 1727204527.19492: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204527.19497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204527.19512: variable 'omit' from source: magic vars
44842 1727204527.19884: variable 'ansible_distribution_major_version' from source: facts
44842 1727204527.19898: Evaluated conditional (ansible_distribution_major_version != '6'): True
44842 1727204527.20059: variable 'network_provider' from source: set_fact
44842 1727204527.20068: variable 'network_state' from source: role '' defaults
44842 1727204527.20076: Evaluated conditional (network_provider == "nm" or network_state != {}): True
44842 1727204527.20082: variable 'omit' from source: magic vars
44842 1727204527.20125: variable 'omit' from source: magic vars
44842 1727204527.20153: variable 'network_service_name' from source: role '' defaults
44842 1727204527.20229: variable 'network_service_name' from source: role '' defaults
44842 1727204527.20335: variable '__network_provider_setup' from source: role '' defaults
44842 1727204527.20340: variable '__network_service_name_default_nm' from source: role '' defaults
44842 1727204527.20408: variable '__network_service_name_default_nm' from source: role '' defaults
44842 1727204527.20416: variable '__network_packages_default_nm' from source: role '' defaults
44842 1727204527.20480: variable '__network_packages_default_nm' from source: role '' defaults
44842 1727204527.20714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
44842 1727204527.23336: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
44842 1727204527.23395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
44842 1727204527.23433: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
44842 1727204527.23468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
44842 1727204527.23494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
44842 1727204527.23568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.23595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.23620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.23666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.23677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.23721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.23745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.23767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.23800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.23813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.24039: variable '__network_packages_default_gobject_packages' from source: role '' defaults
44842 1727204527.24148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.24174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.24198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.24232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.24244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.24330: variable 'ansible_python' from source: facts
44842 1727204527.24353: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
44842 1727204527.24447: variable '__network_wpa_supplicant_required' from source: role '' defaults
44842 1727204527.24530: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
44842 1727204527.24657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.24681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.24706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.24805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.24935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.24988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
44842 1727204527.25011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
44842 1727204527.25146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.25190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
44842 1727204527.25203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
44842 1727204527.25454: variable 'network_connections' from source: play vars
44842 1727204527.25464: variable 'profile' from source: play vars
44842 1727204527.25652: variable 'profile' from source: play vars
44842 1727204527.25658: variable 'interface' from source: set_fact
44842 1727204527.25851: variable 'interface' from source: set_fact
44842 1727204527.26075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
44842 1727204527.26483: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
44842 1727204527.26533: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
44842 1727204527.26598: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
44842 1727204527.26638: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
44842 1727204527.27171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
44842 1727204527.27174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
44842 1727204527.27176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
44842 1727204527.27178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
44842 1727204527.27180: variable '__network_wireless_connections_defined' from source: role '' defaults
44842 1727204527.27513: variable 'network_connections' from source: play vars
44842 1727204527.27520: variable 'profile' from source: play vars
44842 1727204527.27602: variable 'profile' from source: play vars
44842 1727204527.27606: variable 'interface' from source: set_fact
44842 1727204527.27671: variable 'interface' from source: set_fact
44842 1727204527.27705: variable '__network_packages_default_wireless' from source: role '' defaults
44842 1727204527.27788: variable '__network_wireless_connections_defined' from source: role '' defaults
44842 1727204527.28075: variable 'network_connections' from source: play vars
44842 1727204527.28080: variable 'profile' from source: play vars
44842 1727204527.28145: variable 'profile' from source: play vars
44842 1727204527.28148: variable 'interface' from source: set_fact
44842 1727204527.28226: variable 'interface' from source: set_fact
44842 1727204527.28255: variable '__network_packages_default_team' from source: role '' defaults
44842 1727204527.28336: variable '__network_team_connections_defined' from source: role '' defaults
44842 1727204527.28693: variable 'network_connections' from source: play vars
44842 1727204527.28696: variable 'profile' from source: play vars
44842 1727204527.28950: variable 'profile' from source: play vars
44842 1727204527.28955: variable 'interface' from source: set_fact
44842 1727204527.29031: variable 'interface' from source: set_fact
44842 1727204527.29194: variable '__network_service_name_default_initscripts' from source: role '' defaults
44842 1727204527.29246: variable '__network_service_name_default_initscripts' from source: role '' defaults
44842 1727204527.29252: variable '__network_packages_default_initscripts' from source: role '' defaults
44842 1727204527.29423: variable '__network_packages_default_initscripts' from source: role '' defaults
44842 1727204527.29812: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
44842 1727204527.30957: variable 'network_connections' from source: play vars
44842 1727204527.30965: variable 'profile' from source: play vars
44842 1727204527.31067: variable 'profile' from source: play vars
44842 1727204527.31071: variable 'interface' from source: set_fact
44842 1727204527.31230: variable 'interface' from source: set_fact
44842 1727204527.31239: variable 'ansible_distribution' from source: facts
44842 1727204527.31242: variable '__network_rh_distros' from source: role '' defaults
44842 1727204527.31248: variable 'ansible_distribution_major_version' from source: facts
44842 1727204527.31266: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
44842 1727204527.31442: variable 'ansible_distribution' from source: facts
44842 1727204527.31448: variable '__network_rh_distros' from source: role '' defaults
44842 1727204527.31454: variable 'ansible_distribution_major_version' from source: facts
44842 1727204527.31468: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
44842 1727204527.31970: variable 'ansible_distribution' from source: facts
44842 1727204527.31974: variable '__network_rh_distros' from source: role '' defaults
44842 1727204527.31976: variable 'ansible_distribution_major_version' from source: facts
44842 1727204527.31978: variable 'network_provider' from source: set_fact
44842 1727204527.31980: variable 'omit' from source: magic vars
44842 1727204527.31982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
44842 1727204527.31985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
44842 1727204527.31988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
44842 1727204527.31990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204527.31993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
44842 1727204527.31996: variable 'inventory_hostname' from source: host vars for 'managed-node1'
44842 1727204527.31998: variable 'ansible_host' from source: host vars for 'managed-node1'
44842 1727204527.32001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
44842 1727204527.32003: Set connection var ansible_shell_type to sh
44842 1727204527.32005: Set connection var ansible_module_compression to ZIP_DEFLATED
44842 1727204527.32008: Set connection var ansible_connection to ssh
44842 1727204527.32010: Set connection var 
ansible_pipelining to False 44842 1727204527.32013: Set connection var ansible_timeout to 10 44842 1727204527.32015: Set connection var ansible_shell_executable to /bin/sh 44842 1727204527.32030: variable 'ansible_shell_executable' from source: unknown 44842 1727204527.32032: variable 'ansible_connection' from source: unknown 44842 1727204527.32035: variable 'ansible_module_compression' from source: unknown 44842 1727204527.32037: variable 'ansible_shell_type' from source: unknown 44842 1727204527.32040: variable 'ansible_shell_executable' from source: unknown 44842 1727204527.32044: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204527.32051: variable 'ansible_pipelining' from source: unknown 44842 1727204527.32053: variable 'ansible_timeout' from source: unknown 44842 1727204527.32055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204527.32162: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204527.32171: variable 'omit' from source: magic vars 44842 1727204527.32176: starting attempt loop 44842 1727204527.32180: running the handler 44842 1727204527.32262: variable 'ansible_facts' from source: unknown 44842 1727204527.33028: _low_level_execute_command(): starting 44842 1727204527.33035: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204527.34808: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204527.34819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.34830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.34849: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.34892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.34899: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204527.34909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.34923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204527.34930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204527.34936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204527.34944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.34959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.34972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.34980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.34986: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204527.34997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.35072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204527.35092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204527.35104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204527.35198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204527.37072: stdout chunk (state=3): >>>/root <<< 44842 1727204527.37076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204527.37079: stderr 
chunk (state=3): >>><<< 44842 1727204527.37081: stdout chunk (state=3): >>><<< 44842 1727204527.37100: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204527.37111: _low_level_execute_command(): starting 44842 1727204527.37119: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815 `" && echo ansible-tmp-1727204527.3710008-47542-101695544343815="` echo /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815 `" ) && sleep 0' 44842 1727204527.38148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204527.38156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.38171: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.38189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.38228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.38235: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204527.38245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.38262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204527.38266: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204527.38278: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204527.38286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.38299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.38311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.38318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.38325: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204527.38334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.38407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204527.38428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204527.38442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204527.38524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204527.40385: stdout chunk (state=3): 
>>>ansible-tmp-1727204527.3710008-47542-101695544343815=/root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815 <<< 44842 1727204527.40575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204527.40579: stdout chunk (state=3): >>><<< 44842 1727204527.40586: stderr chunk (state=3): >>><<< 44842 1727204527.40605: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204527.3710008-47542-101695544343815=/root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204527.40638: variable 'ansible_module_compression' from source: unknown 44842 1727204527.40694: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44842 1727204527.40756: variable 'ansible_facts' from source: unknown 44842 
1727204527.40947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/AnsiballZ_systemd.py 44842 1727204527.41436: Sending initial data 44842 1727204527.41439: Sent initial data (156 bytes) 44842 1727204527.43665: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204527.43673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.43687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.43699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.43738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.43744: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204527.43752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.43766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204527.43774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204527.43781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204527.43789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.43802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.43812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.43818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.43824: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204527.43833: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.43908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204527.44006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204527.44071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204527.45783: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204527.45831: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204527.45893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpqx7_scv3 /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/AnsiballZ_systemd.py <<< 44842 1727204527.45947: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204527.49374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204527.49377: stderr chunk (state=3): >>><<< 44842 1727204527.49379: stdout chunk (state=3): >>><<< 44842 1727204527.49381: done transferring module to remote 44842 1727204527.49383: _low_level_execute_command(): starting 44842 1727204527.49385: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/ /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/AnsiballZ_systemd.py && sleep 0' 44842 1727204527.50240: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.50244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.50275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204527.50289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.50292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.50369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204527.50372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204527.50374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204527.50443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204527.52158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204527.52241: stderr chunk (state=3): >>><<< 44842 1727204527.52244: stdout chunk (state=3): >>><<< 44842 1727204527.52325: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204527.52328: _low_level_execute_command(): starting 44842 1727204527.52335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/AnsiballZ_systemd.py && sleep 0' 44842 1727204527.53903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204527.53921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.53938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.53958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.54019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 <<< 44842 1727204527.54032: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204527.54048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.54086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204527.54102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204527.54114: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204527.54126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.54141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.54215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.54228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.54240: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204527.54255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.54446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204527.54466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204527.54482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204527.54646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204527.79662: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 44842 1727204527.79698: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "14204928", "MemoryAvailable": "infinity", "CPUUsageNSec": "1688823000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogS<<< 44842 1727204527.79702: stdout chunk (state=3): >>>ignal": "6", "Id": 
"NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": 
"28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44842 1727204527.81297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204527.81362: stderr chunk (state=3): >>><<< 44842 1727204527.81368: stdout chunk (state=3): >>><<< 44842 1727204527.81673: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14204928", "MemoryAvailable": "infinity", "CPUUsageNSec": "1688823000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", 
"ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204527.81685: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204527.81687: _low_level_execute_command(): starting 44842 1727204527.81690: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204527.3710008-47542-101695544343815/ > /dev/null 2>&1 && sleep 0' 44842 1727204527.82334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204527.82357: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204527.82377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.82396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.82440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.82453: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204527.82481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.82498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204527.82509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204527.82519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204527.82530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204527.82542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204527.82558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204527.82582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204527.82594: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204527.82608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204527.82685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204527.82715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204527.82731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204527.82828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44842 1727204527.84649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204527.84762: stderr chunk (state=3): >>><<< 44842 1727204527.84782: stdout chunk (state=3): >>><<< 44842 1727204527.84869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204527.84872: handler run complete 44842 1727204527.85074: attempt loop complete, returning result 44842 1727204527.85077: _execute() done 44842 1727204527.85079: dumping result to json 44842 1727204527.85080: done dumping result, returning 44842 1727204527.85082: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-aad0-d242-00000000009d] 44842 1727204527.85084: sending task result for task 0affcd87-79f5-aad0-d242-00000000009d 44842 1727204527.85191: done sending task 
result for task 0affcd87-79f5-aad0-d242-00000000009d 44842 1727204527.85194: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204527.85246: no more pending results, returning what we have 44842 1727204527.85250: results queue empty 44842 1727204527.85251: checking for any_errors_fatal 44842 1727204527.85258: done checking for any_errors_fatal 44842 1727204527.85258: checking for max_fail_percentage 44842 1727204527.85262: done checking for max_fail_percentage 44842 1727204527.85263: checking to see if all hosts have failed and the running result is not ok 44842 1727204527.85265: done checking to see if all hosts have failed 44842 1727204527.85266: getting the remaining hosts for this loop 44842 1727204527.85268: done getting the remaining hosts for this loop 44842 1727204527.85272: getting the next task for host managed-node1 44842 1727204527.85279: done getting next task for host managed-node1 44842 1727204527.85283: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44842 1727204527.85285: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204527.85294: getting variables 44842 1727204527.85296: in VariableManager get_vars() 44842 1727204527.85338: Calling all_inventory to load vars for managed-node1 44842 1727204527.85341: Calling groups_inventory to load vars for managed-node1 44842 1727204527.85344: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204527.85356: Calling all_plugins_play to load vars for managed-node1 44842 1727204527.85358: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204527.85372: Calling groups_plugins_play to load vars for managed-node1 44842 1727204527.87584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204527.92142: done with get_vars() 44842 1727204527.92184: done getting variables 44842 1727204527.92245: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:02:07 -0400 (0:00:00.736) 0:00:38.090 ***** 44842 1727204527.92291: entering _queue_task() for managed-node1/service 44842 1727204527.92649: worker is 1 (out of 1 available) 44842 1727204527.92679: exiting _queue_task() for managed-node1/service 44842 1727204527.94136: done queuing things up, now waiting for results queue to drain 44842 1727204527.94138: waiting for pending results... 
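The `ok: [managed-node1]` result above closes out the "Enable and start NetworkManager" task, and the echoed `module_args` show exactly what the role handed to the systemd module. A minimal standalone task that reproduces that invocation might look like the following (the task body is reconstructed from the logged `module_args`; the task name and the use of the fully qualified `ansible.builtin.systemd` module name are assumptions, since the log only shows the legacy `ansible.legacy.systemd` resolution):

```yaml
# Reconstructed from the module_args echoed in the log above.
# The role invoked this with no_log, which is why the play recap
# printed the "censored" placeholder instead of the full status dict.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    daemon_reload: false
    daemon_reexec: false
    scope: system
    no_block: false
  no_log: true
```

Because the unit was already `active`/`enabled` (see `ActiveState` and `UnitFileState` in the returned status), the module reported `"changed": false`.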
44842 1727204527.94194: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44842 1727204527.94318: in run() - task 0affcd87-79f5-aad0-d242-00000000009e 44842 1727204527.94482: variable 'ansible_search_path' from source: unknown 44842 1727204527.94491: variable 'ansible_search_path' from source: unknown 44842 1727204527.94535: calling self._execute() 44842 1727204527.94812: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204527.94826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204527.94842: variable 'omit' from source: magic vars 44842 1727204527.95626: variable 'ansible_distribution_major_version' from source: facts 44842 1727204527.95782: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204527.96019: variable 'network_provider' from source: set_fact 44842 1727204527.96023: Evaluated conditional (network_provider == "nm"): True 44842 1727204527.96119: variable '__network_wpa_supplicant_required' from source: role '' defaults 44842 1727204527.96226: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44842 1727204527.96393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204527.99848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204527.99922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204527.99958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204527.99996: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204528.00024: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204528.00120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204528.00148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204528.00178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204528.00219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204528.00232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204528.00282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204528.00305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204528.00331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204528.00377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204528.00385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204528.00425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204528.00448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204528.00479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204528.00516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204528.00530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204528.00690: variable 'network_connections' from source: play vars 44842 1727204528.00703: variable 'profile' from source: play vars 44842 1727204528.00782: variable 'profile' from source: play vars 44842 1727204528.00785: variable 'interface' from source: set_fact 44842 1727204528.00845: variable 'interface' from source: set_fact 44842 1727204528.00919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44842 1727204528.01368: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44842 1727204528.01372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44842 1727204528.01374: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44842 1727204528.01377: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44842 1727204528.01379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44842 1727204528.01381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44842 1727204528.01383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204528.01385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44842 1727204528.01388: variable '__network_wireless_connections_defined' from source: role '' defaults 44842 1727204528.01622: variable 'network_connections' from source: play vars 44842 1727204528.01626: variable 'profile' from source: play vars 44842 1727204528.01693: variable 'profile' from source: play vars 44842 1727204528.01697: variable 'interface' from source: set_fact 44842 1727204528.01755: variable 'interface' from source: set_fact 44842 1727204528.01785: Evaluated conditional (__network_wpa_supplicant_required): False 44842 1727204528.01789: when evaluation is False, skipping this task 44842 1727204528.01791: _execute() done 44842 1727204528.01803: dumping result 
to json 44842 1727204528.01806: done dumping result, returning 44842 1727204528.01808: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-aad0-d242-00000000009e] 44842 1727204528.01810: sending task result for task 0affcd87-79f5-aad0-d242-00000000009e 44842 1727204528.01903: done sending task result for task 0affcd87-79f5-aad0-d242-00000000009e 44842 1727204528.01905: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44842 1727204528.01982: no more pending results, returning what we have 44842 1727204528.01986: results queue empty 44842 1727204528.01987: checking for any_errors_fatal 44842 1727204528.02004: done checking for any_errors_fatal 44842 1727204528.02004: checking for max_fail_percentage 44842 1727204528.02006: done checking for max_fail_percentage 44842 1727204528.02007: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.02008: done checking to see if all hosts have failed 44842 1727204528.02009: getting the remaining hosts for this loop 44842 1727204528.02010: done getting the remaining hosts for this loop 44842 1727204528.02015: getting the next task for host managed-node1 44842 1727204528.02021: done getting next task for host managed-node1 44842 1727204528.02026: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44842 1727204528.02027: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204528.02044: getting variables 44842 1727204528.02046: in VariableManager get_vars() 44842 1727204528.02085: Calling all_inventory to load vars for managed-node1 44842 1727204528.02088: Calling groups_inventory to load vars for managed-node1 44842 1727204528.02090: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.02099: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.02101: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.02104: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.03789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.05776: done with get_vars() 44842 1727204528.05802: done getting variables 44842 1727204528.05875: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.136) 0:00:38.227 ***** 44842 1727204528.05926: entering _queue_task() for managed-node1/service 44842 1727204528.06291: worker is 1 (out of 1 available) 44842 1727204528.06303: exiting _queue_task() for managed-node1/service 44842 1727204528.06315: done queuing things up, now waiting for results queue to drain 44842 1727204528.06316: waiting for pending results... 
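The skip just above follows from `__network_wpa_supplicant_required` evaluating to `False`: with the `nm` provider selected, wpa_supplicant is only managed when a connection profile actually needs it (the log shows the role consulting `__network_ieee802_1x_connections_defined` and `__network_wireless_connections_defined` while deciding). A sketch of that gate, with the variable names taken from the log but the exact `when` expression and task body assumed rather than quoted from the role source:

```yaml
# Illustrative sketch only -- the real task lives at
# roles/network/tasks/main.yml:133 in fedora.linux_system_roles.network
# and its condition may be phrased differently.
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```

Since the play's `network_connections` define neither wireless nor 802.1X profiles, the condition is false and the task is skipped without touching the host.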
44842 1727204528.06593: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 44842 1727204528.06693: in run() - task 0affcd87-79f5-aad0-d242-00000000009f 44842 1727204528.06716: variable 'ansible_search_path' from source: unknown 44842 1727204528.06723: variable 'ansible_search_path' from source: unknown 44842 1727204528.06773: calling self._execute() 44842 1727204528.06882: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.06892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.06903: variable 'omit' from source: magic vars 44842 1727204528.07289: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.07308: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.07427: variable 'network_provider' from source: set_fact 44842 1727204528.07438: Evaluated conditional (network_provider == "initscripts"): False 44842 1727204528.07459: when evaluation is False, skipping this task 44842 1727204528.07476: _execute() done 44842 1727204528.07483: dumping result to json 44842 1727204528.07489: done dumping result, returning 44842 1727204528.07497: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-aad0-d242-00000000009f] 44842 1727204528.07506: sending task result for task 0affcd87-79f5-aad0-d242-00000000009f 44842 1727204528.07610: done sending task result for task 0affcd87-79f5-aad0-d242-00000000009f skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44842 1727204528.07662: no more pending results, returning what we have 44842 1727204528.07668: results queue empty 44842 1727204528.07670: checking for any_errors_fatal 44842 1727204528.07678: done checking for any_errors_fatal 44842 1727204528.07679: 
checking for max_fail_percentage 44842 1727204528.07682: done checking for max_fail_percentage 44842 1727204528.07683: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.07684: done checking to see if all hosts have failed 44842 1727204528.07685: getting the remaining hosts for this loop 44842 1727204528.07687: done getting the remaining hosts for this loop 44842 1727204528.07691: getting the next task for host managed-node1 44842 1727204528.07699: done getting next task for host managed-node1 44842 1727204528.07703: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44842 1727204528.07705: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204528.07723: getting variables 44842 1727204528.07725: in VariableManager get_vars() 44842 1727204528.07763: Calling all_inventory to load vars for managed-node1 44842 1727204528.07767: Calling groups_inventory to load vars for managed-node1 44842 1727204528.07769: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.07781: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.07784: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.07786: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.08866: WORKER PROCESS EXITING 44842 1727204528.09790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.11542: done with get_vars() 44842 1727204528.11572: done getting variables 44842 1727204528.11636: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.057) 0:00:38.284 ***** 44842 1727204528.11671: entering _queue_task() for managed-node1/copy 44842 1727204528.11995: worker is 1 (out of 1 available) 44842 1727204528.12008: exiting _queue_task() for managed-node1/copy 44842 1727204528.12020: done queuing things up, now waiting for results queue to drain 44842 1727204528.12021: waiting for pending results... 44842 1727204528.12311: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44842 1727204528.12426: in run() - task 0affcd87-79f5-aad0-d242-0000000000a0 44842 1727204528.12446: variable 'ansible_search_path' from source: unknown 44842 1727204528.12453: variable 'ansible_search_path' from source: unknown 44842 1727204528.12501: calling self._execute() 44842 1727204528.12606: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.12617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.12629: variable 'omit' from source: magic vars 44842 1727204528.13013: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.13037: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.13151: variable 'network_provider' from source: set_fact 44842 1727204528.13161: Evaluated conditional (network_provider == "initscripts"): False 44842 1727204528.13169: when evaluation is False, skipping this task 44842 1727204528.13175: _execute() done 44842 1727204528.13180: dumping result to json 
44842 1727204528.13185: done dumping result, returning 44842 1727204528.13192: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-aad0-d242-0000000000a0] 44842 1727204528.13201: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a0 44842 1727204528.13313: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a0 44842 1727204528.13321: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44842 1727204528.13381: no more pending results, returning what we have 44842 1727204528.13385: results queue empty 44842 1727204528.13386: checking for any_errors_fatal 44842 1727204528.13394: done checking for any_errors_fatal 44842 1727204528.13395: checking for max_fail_percentage 44842 1727204528.13396: done checking for max_fail_percentage 44842 1727204528.13398: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.13398: done checking to see if all hosts have failed 44842 1727204528.13399: getting the remaining hosts for this loop 44842 1727204528.13402: done getting the remaining hosts for this loop 44842 1727204528.13406: getting the next task for host managed-node1 44842 1727204528.13417: done getting next task for host managed-node1 44842 1727204528.13422: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44842 1727204528.13424: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204528.13441: getting variables 44842 1727204528.13443: in VariableManager get_vars() 44842 1727204528.13488: Calling all_inventory to load vars for managed-node1 44842 1727204528.13492: Calling groups_inventory to load vars for managed-node1 44842 1727204528.13494: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.13508: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.13511: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.13514: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.15918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.17677: done with get_vars() 44842 1727204528.17706: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.061) 0:00:38.346 ***** 44842 1727204528.17797: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44842 1727204528.18134: worker is 1 (out of 1 available) 44842 1727204528.18147: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 44842 1727204528.18158: done queuing things up, now waiting for results queue to drain 44842 1727204528.18160: waiting for pending results... 
44842 1727204528.18445: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44842 1727204528.18549: in run() - task 0affcd87-79f5-aad0-d242-0000000000a1 44842 1727204528.18575: variable 'ansible_search_path' from source: unknown 44842 1727204528.18582: variable 'ansible_search_path' from source: unknown 44842 1727204528.18630: calling self._execute() 44842 1727204528.18740: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.18750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.18766: variable 'omit' from source: magic vars 44842 1727204528.19159: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.19180: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.19192: variable 'omit' from source: magic vars 44842 1727204528.19230: variable 'omit' from source: magic vars 44842 1727204528.19410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44842 1727204528.21846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44842 1727204528.21927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44842 1727204528.21968: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44842 1727204528.22016: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44842 1727204528.22046: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44842 1727204528.22139: variable 'network_provider' from source: set_fact 44842 1727204528.22277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44842 1727204528.22333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44842 1727204528.22362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44842 1727204528.22410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44842 1727204528.22435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44842 1727204528.22513: variable 'omit' from source: magic vars 44842 1727204528.22642: variable 'omit' from source: magic vars 44842 1727204528.22757: variable 'network_connections' from source: play vars 44842 1727204528.22774: variable 'profile' from source: play vars 44842 1727204528.22843: variable 'profile' from source: play vars 44842 1727204528.22857: variable 'interface' from source: set_fact 44842 1727204528.22925: variable 'interface' from source: set_fact 44842 1727204528.23079: variable 'omit' from source: magic vars 44842 1727204528.23094: variable '__lsr_ansible_managed' from source: task vars 44842 1727204528.23155: variable '__lsr_ansible_managed' from source: task vars 44842 1727204528.23424: Loaded config def from plugin (lookup/template) 44842 1727204528.23433: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44842 1727204528.23463: File lookup term: get_ansible_managed.j2 44842 
1727204528.23475: variable 'ansible_search_path' from source: unknown 44842 1727204528.23485: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44842 1727204528.23505: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44842 1727204528.23535: variable 'ansible_search_path' from source: unknown 44842 1727204528.34277: variable 'ansible_managed' from source: unknown 44842 1727204528.34368: variable 'omit' from source: magic vars 44842 1727204528.34388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204528.34404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204528.34417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204528.34428: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204528.34436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204528.34451: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204528.34455: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.34457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.34518: Set connection var ansible_shell_type to sh 44842 1727204528.34527: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204528.34530: Set connection var ansible_connection to ssh 44842 1727204528.34536: Set connection var ansible_pipelining to False 44842 1727204528.34541: Set connection var ansible_timeout to 10 44842 1727204528.34548: Set connection var ansible_shell_executable to /bin/sh 44842 1727204528.34567: variable 'ansible_shell_executable' from source: unknown 44842 1727204528.34570: variable 'ansible_connection' from source: unknown 44842 1727204528.34572: variable 'ansible_module_compression' from source: unknown 44842 1727204528.34574: variable 'ansible_shell_type' from source: unknown 44842 1727204528.34577: variable 'ansible_shell_executable' from source: unknown 44842 1727204528.34579: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.34583: variable 'ansible_pipelining' from source: unknown 44842 1727204528.34585: variable 'ansible_timeout' from source: unknown 44842 1727204528.34589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.34678: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204528.34689: variable 'omit' from source: magic vars 44842 1727204528.34691: starting attempt loop 44842 1727204528.34694: running the handler 44842 1727204528.34705: _low_level_execute_command(): starting 44842 1727204528.34708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204528.35562: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204528.35624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204528.37268: stdout chunk (state=3): >>>/root <<< 44842 1727204528.37374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204528.37427: stderr chunk (state=3): >>><<< 44842 1727204528.37430: stdout chunk (state=3): >>><<< 44842 1727204528.37470: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204528.37473: _low_level_execute_command(): starting 44842 1727204528.37476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816 `" && echo ansible-tmp-1727204528.3744285-47647-208617236889816="` echo /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816 `" ) && sleep 0' 44842 1727204528.37899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204528.37903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204528.37942: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.37945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204528.37948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.37997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204528.38001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204528.38068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204528.39905: stdout chunk (state=3): >>>ansible-tmp-1727204528.3744285-47647-208617236889816=/root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816 <<< 44842 1727204528.40025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204528.40077: stderr chunk (state=3): >>><<< 44842 1727204528.40082: stdout chunk (state=3): >>><<< 44842 1727204528.40096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204528.3744285-47647-208617236889816=/root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204528.40136: variable 'ansible_module_compression' from source: unknown 44842 1727204528.40170: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44842 1727204528.40206: variable 'ansible_facts' from source: unknown 44842 1727204528.40293: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/AnsiballZ_network_connections.py 44842 1727204528.40404: Sending initial data 44842 1727204528.40407: Sent initial data (168 bytes) 44842 1727204528.41075: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204528.41081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204528.41093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204528.41126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.41131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204528.41140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204528.41148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204528.41157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204528.41165: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.41229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204528.41232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204528.41235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204528.41294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204528.42981: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44842 1727204528.42990: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44842 1727204528.42998: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 44842 1727204528.43005: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 44842 1727204528.43012: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 44842 1727204528.43021: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 44842 1727204528.43033: 
stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 44842 1727204528.43041: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 44842 1727204528.43047: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204528.43108: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 44842 1727204528.43115: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 44842 1727204528.43122: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 44842 1727204528.43187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpd94wu2oz /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/AnsiballZ_network_connections.py <<< 44842 1727204528.43252: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204528.44465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204528.44577: stderr chunk (state=3): >>><<< 44842 1727204528.44580: stdout chunk (state=3): >>><<< 44842 1727204528.44597: done transferring module to remote 44842 1727204528.44606: _low_level_execute_command(): starting 44842 1727204528.44611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/ /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/AnsiballZ_network_connections.py && sleep 0' 44842 1727204528.45071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204528.45078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204528.45109: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 44842 1727204528.45121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204528.45131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.45184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204528.45193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204528.45266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204528.46967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204528.47020: stderr chunk (state=3): >>><<< 44842 1727204528.47026: stdout chunk (state=3): >>><<< 44842 1727204528.47047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204528.47051: _low_level_execute_command(): starting 44842 1727204528.47055: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/AnsiballZ_network_connections.py && sleep 0' 44842 1727204528.47514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204528.47520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204528.47553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.47568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.47625: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204528.47638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204528.47705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204528.72056: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cg_19ze_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cg_19ze_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 44842 1727204528.72072: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/7ac3bb1f-688e-4ad4-89b8-fb40a9966f33: error=unknown <<< 44842 1727204528.72232: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44842 1727204528.73669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204528.73725: stderr chunk (state=3): >>><<< 44842 1727204528.73729: stdout chunk (state=3): >>><<< 44842 1727204528.73745: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cg_19ze_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cg_19ze_/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/7ac3bb1f-688e-4ad4-89b8-fb40a9966f33: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204528.73776: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204528.73785: _low_level_execute_command(): starting 44842 1727204528.73790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204528.3744285-47647-208617236889816/ > /dev/null 2>&1 && sleep 0' 44842 1727204528.74239: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204528.74246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204528.74284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.74290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204528.74300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204528.74305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204528.74311: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204528.74320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204528.74391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204528.74394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204528.74396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204528.74451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204528.76254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204528.76331: stderr chunk (state=3): >>><<< 44842 1727204528.76335: stdout chunk (state=3): >>><<< 44842 1727204528.76348: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204528.76356: handler run complete 44842 1727204528.76471: attempt loop complete, returning result 44842 1727204528.76474: _execute() done 44842 1727204528.76476: dumping result to json 44842 1727204528.76478: done dumping result, returning 44842 1727204528.76480: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-aad0-d242-0000000000a1] 44842 1727204528.76482: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a1 44842 1727204528.76546: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a1 44842 1727204528.76549: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": 
"ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44842 1727204528.76664: no more pending results, returning what we have 44842 1727204528.76670: results queue empty 44842 1727204528.76671: checking for any_errors_fatal 44842 1727204528.76677: done checking for any_errors_fatal 44842 1727204528.76678: checking for max_fail_percentage 44842 1727204528.76680: done checking for max_fail_percentage 44842 1727204528.76680: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.76681: done checking to see if all hosts have failed 44842 1727204528.76682: getting the remaining hosts for this loop 44842 1727204528.76684: done getting the remaining hosts for this loop 44842 1727204528.76688: getting the next task for host managed-node1 44842 1727204528.76693: done getting next task for host managed-node1 44842 1727204528.76697: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44842 1727204528.76699: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204528.76709: getting variables 44842 1727204528.76710: in VariableManager get_vars() 44842 1727204528.76745: Calling all_inventory to load vars for managed-node1 44842 1727204528.76747: Calling groups_inventory to load vars for managed-node1 44842 1727204528.76749: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.76758: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.76760: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.76763: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.77624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.79225: done with get_vars() 44842 1727204528.79258: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.615) 0:00:38.962 ***** 44842 1727204528.79392: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44842 1727204528.79754: worker is 1 (out of 1 available) 44842 1727204528.79771: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 44842 1727204528.79785: done queuing things up, now waiting for results queue to drain 44842 1727204528.79787: waiting for pending results... 
44842 1727204528.80199: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 44842 1727204528.80352: in run() - task 0affcd87-79f5-aad0-d242-0000000000a2 44842 1727204528.80373: variable 'ansible_search_path' from source: unknown 44842 1727204528.80376: variable 'ansible_search_path' from source: unknown 44842 1727204528.80407: calling self._execute() 44842 1727204528.80501: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.80505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.80513: variable 'omit' from source: magic vars 44842 1727204528.80809: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.80819: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.80908: variable 'network_state' from source: role '' defaults 44842 1727204528.80918: Evaluated conditional (network_state != {}): False 44842 1727204528.80921: when evaluation is False, skipping this task 44842 1727204528.80924: _execute() done 44842 1727204528.80927: dumping result to json 44842 1727204528.80929: done dumping result, returning 44842 1727204528.80935: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-aad0-d242-0000000000a2] 44842 1727204528.80941: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a2 44842 1727204528.81031: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a2 44842 1727204528.81034: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44842 1727204528.81086: no more pending results, returning what we have 44842 1727204528.81091: results queue empty 44842 1727204528.81092: checking for any_errors_fatal 44842 1727204528.81100: done checking for any_errors_fatal 
44842 1727204528.81100: checking for max_fail_percentage 44842 1727204528.81102: done checking for max_fail_percentage 44842 1727204528.81103: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.81104: done checking to see if all hosts have failed 44842 1727204528.81105: getting the remaining hosts for this loop 44842 1727204528.81106: done getting the remaining hosts for this loop 44842 1727204528.81110: getting the next task for host managed-node1 44842 1727204528.81117: done getting next task for host managed-node1 44842 1727204528.81121: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44842 1727204528.81123: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204528.81137: getting variables 44842 1727204528.81139: in VariableManager get_vars() 44842 1727204528.81177: Calling all_inventory to load vars for managed-node1 44842 1727204528.81179: Calling groups_inventory to load vars for managed-node1 44842 1727204528.81181: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.81191: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.81193: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.81196: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.86077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.87024: done with get_vars() 44842 1727204528.87045: done getting variables 44842 1727204528.87087: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.077) 0:00:39.039 ***** 44842 1727204528.87107: entering _queue_task() for managed-node1/debug 44842 1727204528.87349: worker is 1 (out of 1 available) 44842 1727204528.87372: exiting _queue_task() for managed-node1/debug 44842 1727204528.87386: done queuing things up, now waiting for results queue to drain 44842 1727204528.87387: waiting for pending results... 
44842 1727204528.87579: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44842 1727204528.87649: in run() - task 0affcd87-79f5-aad0-d242-0000000000a3 44842 1727204528.87666: variable 'ansible_search_path' from source: unknown 44842 1727204528.87670: variable 'ansible_search_path' from source: unknown 44842 1727204528.87698: calling self._execute() 44842 1727204528.87782: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.87786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.87794: variable 'omit' from source: magic vars 44842 1727204528.88084: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.88095: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.88102: variable 'omit' from source: magic vars 44842 1727204528.88133: variable 'omit' from source: magic vars 44842 1727204528.88158: variable 'omit' from source: magic vars 44842 1727204528.88197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204528.88223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204528.88241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204528.88256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204528.88268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204528.88293: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204528.88297: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.88300: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204528.88369: Set connection var ansible_shell_type to sh 44842 1727204528.88378: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204528.88383: Set connection var ansible_connection to ssh 44842 1727204528.88389: Set connection var ansible_pipelining to False 44842 1727204528.88394: Set connection var ansible_timeout to 10 44842 1727204528.88400: Set connection var ansible_shell_executable to /bin/sh 44842 1727204528.88418: variable 'ansible_shell_executable' from source: unknown 44842 1727204528.88421: variable 'ansible_connection' from source: unknown 44842 1727204528.88424: variable 'ansible_module_compression' from source: unknown 44842 1727204528.88427: variable 'ansible_shell_type' from source: unknown 44842 1727204528.88429: variable 'ansible_shell_executable' from source: unknown 44842 1727204528.88431: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.88434: variable 'ansible_pipelining' from source: unknown 44842 1727204528.88437: variable 'ansible_timeout' from source: unknown 44842 1727204528.88440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.88543: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204528.88552: variable 'omit' from source: magic vars 44842 1727204528.88557: starting attempt loop 44842 1727204528.88563: running the handler 44842 1727204528.88656: variable '__network_connections_result' from source: set_fact 44842 1727204528.88700: handler run complete 44842 1727204528.88714: attempt loop complete, returning result 44842 1727204528.88718: _execute() done 44842 1727204528.88720: dumping result to json 44842 1727204528.88723: 
done dumping result, returning 44842 1727204528.88730: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-aad0-d242-0000000000a3] 44842 1727204528.88735: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a3 44842 1727204528.88832: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a3 44842 1727204528.88834: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 44842 1727204528.88900: no more pending results, returning what we have 44842 1727204528.88907: results queue empty 44842 1727204528.88908: checking for any_errors_fatal 44842 1727204528.88918: done checking for any_errors_fatal 44842 1727204528.88919: checking for max_fail_percentage 44842 1727204528.88920: done checking for max_fail_percentage 44842 1727204528.88921: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.88922: done checking to see if all hosts have failed 44842 1727204528.88923: getting the remaining hosts for this loop 44842 1727204528.88925: done getting the remaining hosts for this loop 44842 1727204528.88928: getting the next task for host managed-node1 44842 1727204528.88934: done getting next task for host managed-node1 44842 1727204528.88938: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44842 1727204528.88940: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204528.88949: getting variables 44842 1727204528.88951: in VariableManager get_vars() 44842 1727204528.88991: Calling all_inventory to load vars for managed-node1 44842 1727204528.88994: Calling groups_inventory to load vars for managed-node1 44842 1727204528.88996: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.89012: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.89016: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.89019: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.89874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.90851: done with get_vars() 44842 1727204528.90873: done getting variables 44842 1727204528.90917: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.038) 0:00:39.077 ***** 44842 1727204528.90941: entering _queue_task() for managed-node1/debug 44842 1727204528.91188: worker is 1 (out of 1 available) 44842 1727204528.91201: exiting _queue_task() for managed-node1/debug 44842 1727204528.91213: done queuing things up, now waiting for results queue to drain 44842 1727204528.91214: waiting for pending results... 
44842 1727204528.91407: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44842 1727204528.91483: in run() - task 0affcd87-79f5-aad0-d242-0000000000a4 44842 1727204528.91499: variable 'ansible_search_path' from source: unknown 44842 1727204528.91503: variable 'ansible_search_path' from source: unknown 44842 1727204528.91532: calling self._execute() 44842 1727204528.91618: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.91625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.91634: variable 'omit' from source: magic vars 44842 1727204528.91912: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.91941: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.91954: variable 'omit' from source: magic vars 44842 1727204528.91984: variable 'omit' from source: magic vars 44842 1727204528.92011: variable 'omit' from source: magic vars 44842 1727204528.92045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204528.92078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204528.92095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204528.92108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204528.92118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204528.92143: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204528.92148: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.92150: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 44842 1727204528.92224: Set connection var ansible_shell_type to sh 44842 1727204528.92233: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204528.92238: Set connection var ansible_connection to ssh 44842 1727204528.92243: Set connection var ansible_pipelining to False 44842 1727204528.92248: Set connection var ansible_timeout to 10 44842 1727204528.92254: Set connection var ansible_shell_executable to /bin/sh 44842 1727204528.92277: variable 'ansible_shell_executable' from source: unknown 44842 1727204528.92280: variable 'ansible_connection' from source: unknown 44842 1727204528.92284: variable 'ansible_module_compression' from source: unknown 44842 1727204528.92287: variable 'ansible_shell_type' from source: unknown 44842 1727204528.92289: variable 'ansible_shell_executable' from source: unknown 44842 1727204528.92292: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.92295: variable 'ansible_pipelining' from source: unknown 44842 1727204528.92297: variable 'ansible_timeout' from source: unknown 44842 1727204528.92299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.92401: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204528.92414: variable 'omit' from source: magic vars 44842 1727204528.92418: starting attempt loop 44842 1727204528.92420: running the handler 44842 1727204528.92456: variable '__network_connections_result' from source: set_fact 44842 1727204528.92519: variable '__network_connections_result' from source: set_fact 44842 1727204528.92594: handler run complete 44842 1727204528.92610: attempt loop complete, returning result 44842 1727204528.92614: 
_execute() done 44842 1727204528.92616: dumping result to json 44842 1727204528.92618: done dumping result, returning 44842 1727204528.92626: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-aad0-d242-0000000000a4] 44842 1727204528.92634: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a4 44842 1727204528.92727: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a4 44842 1727204528.92730: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 44842 1727204528.92830: no more pending results, returning what we have 44842 1727204528.92834: results queue empty 44842 1727204528.92835: checking for any_errors_fatal 44842 1727204528.92840: done checking for any_errors_fatal 44842 1727204528.92841: checking for max_fail_percentage 44842 1727204528.92842: done checking for max_fail_percentage 44842 1727204528.92843: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.92844: done checking to see if all hosts have failed 44842 1727204528.92849: getting the remaining hosts for this loop 44842 1727204528.92851: done getting the remaining hosts for this loop 44842 1727204528.92854: getting the next task for host managed-node1 44842 1727204528.92862: done getting next task for host managed-node1 44842 1727204528.92867: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44842 1727204528.92868: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204528.92877: getting variables 44842 1727204528.92879: in VariableManager get_vars() 44842 1727204528.92912: Calling all_inventory to load vars for managed-node1 44842 1727204528.92915: Calling groups_inventory to load vars for managed-node1 44842 1727204528.92917: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.92929: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.92932: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.92935: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.93874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.94855: done with get_vars() 44842 1727204528.94878: done getting variables 44842 1727204528.94923: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:02:08 -0400 (0:00:00.040) 0:00:39.117 ***** 44842 1727204528.94951: entering _queue_task() for managed-node1/debug 44842 1727204528.95192: worker is 1 (out of 1 available) 44842 1727204528.95204: exiting _queue_task() for managed-node1/debug 44842 1727204528.95216: done queuing things up, now waiting for results queue to drain 44842 1727204528.95218: waiting for pending results... 
44842 1727204528.95406: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44842 1727204528.95476: in run() - task 0affcd87-79f5-aad0-d242-0000000000a5 44842 1727204528.95488: variable 'ansible_search_path' from source: unknown 44842 1727204528.95492: variable 'ansible_search_path' from source: unknown 44842 1727204528.95521: calling self._execute() 44842 1727204528.95604: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.95608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.95617: variable 'omit' from source: magic vars 44842 1727204528.95891: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.95902: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.95988: variable 'network_state' from source: role '' defaults 44842 1727204528.95999: Evaluated conditional (network_state != {}): False 44842 1727204528.96002: when evaluation is False, skipping this task 44842 1727204528.96005: _execute() done 44842 1727204528.96008: dumping result to json 44842 1727204528.96010: done dumping result, returning 44842 1727204528.96013: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-aad0-d242-0000000000a5] 44842 1727204528.96020: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a5 44842 1727204528.96110: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a5 44842 1727204528.96113: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 44842 1727204528.96159: no more pending results, returning what we have 44842 1727204528.96166: results queue empty 44842 1727204528.96167: checking for any_errors_fatal 44842 1727204528.96178: done checking for any_errors_fatal 44842 1727204528.96178: checking for 
max_fail_percentage 44842 1727204528.96180: done checking for max_fail_percentage 44842 1727204528.96181: checking to see if all hosts have failed and the running result is not ok 44842 1727204528.96182: done checking to see if all hosts have failed 44842 1727204528.96182: getting the remaining hosts for this loop 44842 1727204528.96184: done getting the remaining hosts for this loop 44842 1727204528.96188: getting the next task for host managed-node1 44842 1727204528.96194: done getting next task for host managed-node1 44842 1727204528.96198: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44842 1727204528.96200: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204528.96214: getting variables 44842 1727204528.96215: in VariableManager get_vars() 44842 1727204528.96255: Calling all_inventory to load vars for managed-node1 44842 1727204528.96257: Calling groups_inventory to load vars for managed-node1 44842 1727204528.96262: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204528.96273: Calling all_plugins_play to load vars for managed-node1 44842 1727204528.96275: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204528.96278: Calling groups_plugins_play to load vars for managed-node1 44842 1727204528.97117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204528.98704: done with get_vars() 44842 1727204528.98727: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:02:08 -0400 
(0:00:00.038) 0:00:39.156 ***** 44842 1727204528.98841: entering _queue_task() for managed-node1/ping 44842 1727204528.99084: worker is 1 (out of 1 available) 44842 1727204528.99098: exiting _queue_task() for managed-node1/ping 44842 1727204528.99111: done queuing things up, now waiting for results queue to drain 44842 1727204528.99112: waiting for pending results... 44842 1727204528.99310: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 44842 1727204528.99375: in run() - task 0affcd87-79f5-aad0-d242-0000000000a6 44842 1727204528.99392: variable 'ansible_search_path' from source: unknown 44842 1727204528.99396: variable 'ansible_search_path' from source: unknown 44842 1727204528.99425: calling self._execute() 44842 1727204528.99513: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204528.99516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204528.99526: variable 'omit' from source: magic vars 44842 1727204528.99890: variable 'ansible_distribution_major_version' from source: facts 44842 1727204528.99899: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204528.99907: variable 'omit' from source: magic vars 44842 1727204528.99949: variable 'omit' from source: magic vars 44842 1727204529.00030: variable 'omit' from source: magic vars 44842 1727204529.00034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204529.00062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204529.00088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204529.00105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204529.00116: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204529.00148: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204529.00151: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204529.00154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204529.00254: Set connection var ansible_shell_type to sh 44842 1727204529.00269: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204529.00276: Set connection var ansible_connection to ssh 44842 1727204529.00279: Set connection var ansible_pipelining to False 44842 1727204529.00285: Set connection var ansible_timeout to 10 44842 1727204529.00293: Set connection var ansible_shell_executable to /bin/sh 44842 1727204529.00316: variable 'ansible_shell_executable' from source: unknown 44842 1727204529.00319: variable 'ansible_connection' from source: unknown 44842 1727204529.00321: variable 'ansible_module_compression' from source: unknown 44842 1727204529.00324: variable 'ansible_shell_type' from source: unknown 44842 1727204529.00326: variable 'ansible_shell_executable' from source: unknown 44842 1727204529.00328: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204529.00332: variable 'ansible_pipelining' from source: unknown 44842 1727204529.00335: variable 'ansible_timeout' from source: unknown 44842 1727204529.00339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204529.00550: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204529.00563: variable 'omit' from source: magic vars 44842 1727204529.00574: starting attempt loop 44842 1727204529.00577: running 
the handler 44842 1727204529.00775: _low_level_execute_command(): starting 44842 1727204529.00778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204529.01481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.01484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.01488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.01490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.01492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.01495: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.01497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.01499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.01502: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.01504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.01507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.01509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.01570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.01573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.01575: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.01577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.01673: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.01677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.01679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.01718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.03285: stdout chunk (state=3): >>>/root <<< 44842 1727204529.03455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.03458: stdout chunk (state=3): >>><<< 44842 1727204529.03470: stderr chunk (state=3): >>><<< 44842 1727204529.03491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.03503: _low_level_execute_command(): starting 44842 1727204529.03510: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110 `" && echo ansible-tmp-1727204529.0349038-47683-183625362575110="` echo /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110 `" ) && sleep 0' 44842 1727204529.04257: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.04267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.04282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.04296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.04333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.04340: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.04350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.04367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.04372: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.04379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.04386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.04396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.04407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.04412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.04420: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.04428: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.04504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.04521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.04533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.04617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.06452: stdout chunk (state=3): >>>ansible-tmp-1727204529.0349038-47683-183625362575110=/root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110 <<< 44842 1727204529.06641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.06645: stdout chunk (state=3): >>><<< 44842 1727204529.06652: stderr chunk (state=3): >>><<< 44842 1727204529.06690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204529.0349038-47683-183625362575110=/root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.06734: variable 'ansible_module_compression' from source: unknown 44842 1727204529.06779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44842 1727204529.06820: variable 'ansible_facts' from source: unknown 44842 1727204529.06897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/AnsiballZ_ping.py 44842 1727204529.07176: Sending initial data 44842 1727204529.07180: Sent initial data (153 bytes) 44842 1727204529.08838: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.08853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.08879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.08902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.08945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.08958: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.08976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.08991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.09001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.09010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.09019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.09030: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.09042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.09052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.09070: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.09082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.09156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.09185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.09200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.09408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.10980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204529.11036: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204529.11098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmployuiza2 /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/AnsiballZ_ping.py <<< 44842 
1727204529.11151: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204529.12331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.12465: stderr chunk (state=3): >>><<< 44842 1727204529.12469: stdout chunk (state=3): >>><<< 44842 1727204529.12472: done transferring module to remote 44842 1727204529.12474: _low_level_execute_command(): starting 44842 1727204529.12481: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/ /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/AnsiballZ_ping.py && sleep 0' 44842 1727204529.13083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.13097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.13112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.13130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.13180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.13192: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.13206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.13224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.13236: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.13247: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.13259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.13280: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 44842 1727204529.13298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.13310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.13321: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.13335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.13415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.13431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.13446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.13773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.15341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.15438: stderr chunk (state=3): >>><<< 44842 1727204529.15448: stdout chunk (state=3): >>><<< 44842 1727204529.15565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.15569: _low_level_execute_command(): starting 44842 1727204529.15571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/AnsiballZ_ping.py && sleep 0' 44842 1727204529.17452: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.17474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.17489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.17511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.17559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.17632: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.17647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.17669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.17680: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.17690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.17700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.17713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.17733: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.17750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.17852: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.17872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.17952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.17982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.17997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.18097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.30871: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44842 1727204529.32001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204529.32005: stdout chunk (state=3): >>><<< 44842 1727204529.32007: stderr chunk (state=3): >>><<< 44842 1727204529.32133: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204529.32138: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204529.32145: _low_level_execute_command(): starting 44842 1727204529.32147: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204529.0349038-47683-183625362575110/ > /dev/null 2>&1 && sleep 0' 44842 1727204529.32759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.32775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.32789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.32812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.32855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.32887: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.32905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.32929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.32940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 
1727204529.32950: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.32960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.32976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.32991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.33001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.33011: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.33024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.33105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.33126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.33149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.33235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.35161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.35167: stdout chunk (state=3): >>><<< 44842 1727204529.35169: stderr chunk (state=3): >>><<< 44842 1727204529.35649: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.35653: handler run complete 44842 1727204529.35656: attempt loop complete, returning result 44842 1727204529.35658: _execute() done 44842 1727204529.35660: dumping result to json 44842 1727204529.35662: done dumping result, returning 44842 1727204529.35666: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-aad0-d242-0000000000a6] 44842 1727204529.35668: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a6 44842 1727204529.35761: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a6 44842 1727204529.35771: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 44842 1727204529.35828: no more pending results, returning what we have 44842 1727204529.35831: results queue empty 44842 1727204529.35832: checking for any_errors_fatal 44842 1727204529.35837: done checking for any_errors_fatal 44842 1727204529.35838: checking for max_fail_percentage 44842 1727204529.35840: done checking for max_fail_percentage 44842 1727204529.35841: checking to see if all hosts have failed and the running result is not ok 44842 1727204529.35842: done checking to see if all hosts have failed 44842 1727204529.35843: getting the remaining hosts for this loop 44842 1727204529.35844: done getting the remaining hosts for this loop 44842 1727204529.35848: getting 
the next task for host managed-node1 44842 1727204529.35854: done getting next task for host managed-node1 44842 1727204529.35857: ^ task is: TASK: meta (role_complete) 44842 1727204529.35859: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204529.35875: getting variables 44842 1727204529.35877: in VariableManager get_vars() 44842 1727204529.35922: Calling all_inventory to load vars for managed-node1 44842 1727204529.35926: Calling groups_inventory to load vars for managed-node1 44842 1727204529.35929: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204529.35940: Calling all_plugins_play to load vars for managed-node1 44842 1727204529.35949: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204529.35957: Calling groups_plugins_play to load vars for managed-node1 44842 1727204529.38331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204529.40748: done with get_vars() 44842 1727204529.40785: done getting variables 44842 1727204529.40871: done queuing things up, now waiting for results queue to drain 44842 1727204529.40874: results queue empty 44842 1727204529.40874: checking for any_errors_fatal 44842 1727204529.40878: done checking for any_errors_fatal 44842 1727204529.40879: checking for max_fail_percentage 44842 1727204529.40880: done checking for max_fail_percentage 44842 1727204529.40880: checking to see if all hosts have failed and the running result is not ok 44842 1727204529.40881: done checking to see if all hosts have failed 44842 1727204529.40882: getting the remaining hosts for this loop 44842 1727204529.40883: done getting the remaining hosts for this loop 44842 1727204529.40886: 
getting the next task for host managed-node1 44842 1727204529.40890: done getting next task for host managed-node1 44842 1727204529.40891: ^ task is: TASK: meta (flush_handlers) 44842 1727204529.40893: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204529.40896: getting variables 44842 1727204529.40897: in VariableManager get_vars() 44842 1727204529.40909: Calling all_inventory to load vars for managed-node1 44842 1727204529.40911: Calling groups_inventory to load vars for managed-node1 44842 1727204529.40913: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204529.40923: Calling all_plugins_play to load vars for managed-node1 44842 1727204529.40925: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204529.40928: Calling groups_plugins_play to load vars for managed-node1 44842 1727204529.42390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204529.44310: done with get_vars() 44842 1727204529.44337: done getting variables 44842 1727204529.44394: in VariableManager get_vars() 44842 1727204529.44436: Calling all_inventory to load vars for managed-node1 44842 1727204529.44440: Calling groups_inventory to load vars for managed-node1 44842 1727204529.44443: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204529.44448: Calling all_plugins_play to load vars for managed-node1 44842 1727204529.44450: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204529.44453: Calling groups_plugins_play to load vars for managed-node1 44842 1727204529.45868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 
1727204529.50071: done with get_vars() 44842 1727204529.50113: done queuing things up, now waiting for results queue to drain 44842 1727204529.50116: results queue empty 44842 1727204529.50117: checking for any_errors_fatal 44842 1727204529.50118: done checking for any_errors_fatal 44842 1727204529.50119: checking for max_fail_percentage 44842 1727204529.50120: done checking for max_fail_percentage 44842 1727204529.50121: checking to see if all hosts have failed and the running result is not ok 44842 1727204529.50121: done checking to see if all hosts have failed 44842 1727204529.50122: getting the remaining hosts for this loop 44842 1727204529.50123: done getting the remaining hosts for this loop 44842 1727204529.50126: getting the next task for host managed-node1 44842 1727204529.50130: done getting next task for host managed-node1 44842 1727204529.50132: ^ task is: TASK: meta (flush_handlers) 44842 1727204529.50133: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204529.50252: getting variables 44842 1727204529.50254: in VariableManager get_vars() 44842 1727204529.50270: Calling all_inventory to load vars for managed-node1 44842 1727204529.50274: Calling groups_inventory to load vars for managed-node1 44842 1727204529.50276: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204529.50282: Calling all_plugins_play to load vars for managed-node1 44842 1727204529.50284: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204529.50287: Calling groups_plugins_play to load vars for managed-node1 44842 1727204529.53180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204529.55551: done with get_vars() 44842 1727204529.55583: done getting variables 44842 1727204529.55661: in VariableManager get_vars() 44842 1727204529.55678: Calling all_inventory to load vars for managed-node1 44842 1727204529.55680: Calling groups_inventory to load vars for managed-node1 44842 1727204529.55682: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204529.55687: Calling all_plugins_play to load vars for managed-node1 44842 1727204529.55690: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204529.55693: Calling groups_plugins_play to load vars for managed-node1 44842 1727204529.57072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204529.60919: done with get_vars() 44842 1727204529.60948: done queuing things up, now waiting for results queue to drain 44842 1727204529.60951: results queue empty 44842 1727204529.60952: checking for any_errors_fatal 44842 1727204529.61067: done checking for any_errors_fatal 44842 1727204529.61069: checking for max_fail_percentage 44842 1727204529.61070: done checking for max_fail_percentage 44842 1727204529.61071: checking to see if all hosts have failed and the running result is not 
ok 44842 1727204529.61072: done checking to see if all hosts have failed 44842 1727204529.61075: getting the remaining hosts for this loop 44842 1727204529.61076: done getting the remaining hosts for this loop 44842 1727204529.61079: getting the next task for host managed-node1 44842 1727204529.61083: done getting next task for host managed-node1 44842 1727204529.61084: ^ task is: None 44842 1727204529.61085: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204529.61086: done queuing things up, now waiting for results queue to drain 44842 1727204529.61087: results queue empty 44842 1727204529.61088: checking for any_errors_fatal 44842 1727204529.61089: done checking for any_errors_fatal 44842 1727204529.61090: checking for max_fail_percentage 44842 1727204529.61091: done checking for max_fail_percentage 44842 1727204529.61091: checking to see if all hosts have failed and the running result is not ok 44842 1727204529.61092: done checking to see if all hosts have failed 44842 1727204529.61093: getting the next task for host managed-node1 44842 1727204529.61096: done getting next task for host managed-node1 44842 1727204529.61096: ^ task is: None 44842 1727204529.61098: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204529.61192: in VariableManager get_vars() 44842 1727204529.61209: done with get_vars() 44842 1727204529.61215: in VariableManager get_vars() 44842 1727204529.61224: done with get_vars() 44842 1727204529.61228: variable 'omit' from source: magic vars 44842 1727204529.61259: in VariableManager get_vars() 44842 1727204529.61272: done with get_vars() 44842 1727204529.61409: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 44842 1727204529.61887: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44842 1727204529.62479: getting the remaining hosts for this loop 44842 1727204529.62480: done getting the remaining hosts for this loop 44842 1727204529.62484: getting the next task for host managed-node1 44842 1727204529.62487: done getting next task for host managed-node1 44842 1727204529.62490: ^ task is: TASK: Gathering Facts 44842 1727204529.62491: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204529.62493: getting variables 44842 1727204529.62494: in VariableManager get_vars() 44842 1727204529.62505: Calling all_inventory to load vars for managed-node1 44842 1727204529.62507: Calling groups_inventory to load vars for managed-node1 44842 1727204529.62510: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204529.62515: Calling all_plugins_play to load vars for managed-node1 44842 1727204529.62518: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204529.62521: Calling groups_plugins_play to load vars for managed-node1 44842 1727204529.64659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204529.67178: done with get_vars() 44842 1727204529.67221: done getting variables 44842 1727204529.67272: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 Tuesday 24 September 2024 15:02:09 -0400 (0:00:00.685) 0:00:39.842 ***** 44842 1727204529.67419: entering _queue_task() for managed-node1/gather_facts 44842 1727204529.68186: worker is 1 (out of 1 available) 44842 1727204529.68198: exiting _queue_task() for managed-node1/gather_facts 44842 1727204529.68271: done queuing things up, now waiting for results queue to drain 44842 1727204529.68273: waiting for pending results... 
44842 1727204529.69019: running TaskExecutor() for managed-node1/TASK: Gathering Facts 44842 1727204529.69253: in run() - task 0affcd87-79f5-aad0-d242-00000000066a 44842 1727204529.69278: variable 'ansible_search_path' from source: unknown 44842 1727204529.69443: calling self._execute() 44842 1727204529.69592: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204529.69661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204529.69683: variable 'omit' from source: magic vars 44842 1727204529.70481: variable 'ansible_distribution_major_version' from source: facts 44842 1727204529.70650: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204529.70662: variable 'omit' from source: magic vars 44842 1727204529.70696: variable 'omit' from source: magic vars 44842 1727204529.70812: variable 'omit' from source: magic vars 44842 1727204529.70866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204529.70910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204529.70937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204529.70970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204529.70987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204529.71027: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204529.71036: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204529.71044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204529.71155: Set connection var ansible_shell_type to sh 44842 1727204529.71180: Set connection 
var ansible_module_compression to ZIP_DEFLATED 44842 1727204529.71192: Set connection var ansible_connection to ssh 44842 1727204529.71202: Set connection var ansible_pipelining to False 44842 1727204529.71217: Set connection var ansible_timeout to 10 44842 1727204529.71230: Set connection var ansible_shell_executable to /bin/sh 44842 1727204529.71258: variable 'ansible_shell_executable' from source: unknown 44842 1727204529.71268: variable 'ansible_connection' from source: unknown 44842 1727204529.71276: variable 'ansible_module_compression' from source: unknown 44842 1727204529.71288: variable 'ansible_shell_type' from source: unknown 44842 1727204529.71294: variable 'ansible_shell_executable' from source: unknown 44842 1727204529.71300: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204529.71308: variable 'ansible_pipelining' from source: unknown 44842 1727204529.71315: variable 'ansible_timeout' from source: unknown 44842 1727204529.71325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204529.71786: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204529.71808: variable 'omit' from source: magic vars 44842 1727204529.71817: starting attempt loop 44842 1727204529.71832: running the handler 44842 1727204529.71853: variable 'ansible_facts' from source: unknown 44842 1727204529.71885: _low_level_execute_command(): starting 44842 1727204529.71897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204529.72770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.72787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 
1727204529.72811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.72833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.72948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.72961: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.72979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.73003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.73016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.73035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.73050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.73066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.73084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.73098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.73108: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.73120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.73204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.73226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.73240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.73336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 
1727204529.74966: stdout chunk (state=3): >>>/root <<< 44842 1727204529.75121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.75191: stderr chunk (state=3): >>><<< 44842 1727204529.75194: stdout chunk (state=3): >>><<< 44842 1727204529.75297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.75320: _low_level_execute_command(): starting 44842 1727204529.75323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881 `" && echo ansible-tmp-1727204529.752174-47788-84862059878881="` echo /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881 `" ) && sleep 0' 44842 1727204529.75978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 44842 1727204529.75992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.76006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.76023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.76076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.76089: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.76103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.76120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.76131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.76142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.76161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.76179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.76195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.76207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.76218: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.76231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.76318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.76339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.76355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
44842 1727204529.76447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.78285: stdout chunk (state=3): >>>ansible-tmp-1727204529.752174-47788-84862059878881=/root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881 <<< 44842 1727204529.78498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.78501: stdout chunk (state=3): >>><<< 44842 1727204529.78504: stderr chunk (state=3): >>><<< 44842 1727204529.78675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204529.752174-47788-84862059878881=/root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.78680: variable 'ansible_module_compression' from source: unknown 44842 1727204529.78682: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44842 1727204529.78785: variable 'ansible_facts' from source: unknown 44842 1727204529.78952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/AnsiballZ_setup.py 44842 1727204529.79215: Sending initial data 44842 1727204529.79218: Sent initial data (152 bytes) 44842 1727204529.80258: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.80277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.80294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.80314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.80369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.80384: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.80399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.80417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.80430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.80451: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.80469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.80485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.80502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.80515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 
1727204529.80527: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.80542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.80630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.80653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.80681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.80776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.82488: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204529.82534: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204529.82591: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmp6rqgxm3t /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/AnsiballZ_setup.py <<< 44842 1727204529.82638: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204529.85244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.85383: stderr chunk (state=3): >>><<< 44842 1727204529.85387: stdout chunk (state=3): >>><<< 44842 1727204529.85389: done transferring module 
to remote 44842 1727204529.85392: _low_level_execute_command(): starting 44842 1727204529.85398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/ /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/AnsiballZ_setup.py && sleep 0' 44842 1727204529.86010: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.86614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.86632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.86651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.86718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.86835: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.86852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.86901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.87008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.87020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.87033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.87047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.87066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.87082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.87094: stderr chunk (state=3): >>>debug2: match found <<< 44842 
1727204529.87107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.87181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.87234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.87258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.87358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204529.89053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204529.89143: stderr chunk (state=3): >>><<< 44842 1727204529.89147: stdout chunk (state=3): >>><<< 44842 1727204529.89240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204529.89244: 
_low_level_execute_command(): starting 44842 1727204529.89247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/AnsiballZ_setup.py && sleep 0' 44842 1727204529.89815: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204529.89830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.89845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.89867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.89910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.89924: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204529.89939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.89956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204529.89972: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204529.89984: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204529.89997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204529.90010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204529.90025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204529.90037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204529.90048: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204529.90061: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204529.90138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204529.90157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204529.90174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204529.90272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204530.41206: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/ro<<< 44842 1727204530.41236: stdout chunk (state=3): >>>ot", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "10", "epoch": "1727204530", "epoch_int": "1727204530", "date": "2024-09-24", "time": "15:02:10", "iso8601_micro": "2024-09-24T19:02:10.141331Z", "iso8601": "2024-09-24T19:02:10Z", "iso8601_basic": "20240924T150210141331", "iso8601_basic_short": "20240924T150210", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.31, "5m": 0.41, "15m": 0.28}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2778, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3532, "used": 754, "free": 2778}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28<<< 44842 1727204530.41268: stdout chunk (state=3): >>>c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 793, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271765504, "block_size": 4096, "block_total": 
65519355, "block_available": 64519474, "block_used": 999881, "inode_total": 131071472, "inode_available": 130998229, "inode_used": 73243, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed<<< 44842 1727204530.41299: stdout chunk (state=3): >>>]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off 
[fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": 
false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "<<< 44842 1727204530.41304: stdout chunk (state=3): >>>on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44842 1727204530.42940: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204530.42944: stdout chunk (state=3): >>><<< 44842 1727204530.42947: stderr chunk (state=3): >>><<< 44842 1727204530.43081: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": 
"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "02", "second": "10", "epoch": "1727204530", "epoch_int": "1727204530", "date": "2024-09-24", "time": "15:02:10", "iso8601_micro": "2024-09-24T19:02:10.141331Z", "iso8601": "2024-09-24T19:02:10Z", "iso8601_basic": "20240924T150210141331", "iso8601_basic_short": "20240924T150210", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.31, "5m": 0.41, "15m": 0.28}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2778, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 754, "free": 2778}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 793, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271765504, "block_size": 4096, "block_total": 65519355, "block_available": 64519474, "block_used": 999881, "inode_total": 131071472, "inode_available": 130998229, "inode_used": 73243, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204530.43476: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204530.43504: _low_level_execute_command(): starting 44842 1727204530.43516: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204529.752174-47788-84862059878881/ > /dev/null 2>&1 && sleep 0' 44842 1727204530.44421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204530.44425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.44458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.44470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.44473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.44543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204530.44546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204530.44551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204530.44614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204530.46377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204530.46476: stderr chunk (state=3): >>><<< 44842 1727204530.46490: stdout chunk (state=3): >>><<< 44842 1727204530.46569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204530.46573: handler run complete 44842 1727204530.46703: variable 'ansible_facts' from source: unknown 44842 1727204530.46856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.47200: variable 'ansible_facts' from source: unknown 44842 1727204530.47311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.47473: attempt loop complete, returning result 44842 1727204530.47491: _execute() done 44842 1727204530.47499: dumping result to json 44842 1727204530.47535: done dumping result, returning 44842 1727204530.47546: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-aad0-d242-00000000066a] 44842 1727204530.47555: sending task result for task 0affcd87-79f5-aad0-d242-00000000066a ok: [managed-node1] 44842 1727204530.48324: no more pending results, returning what we have 44842 1727204530.48329: results queue empty 44842 1727204530.48330: checking for any_errors_fatal 44842 1727204530.48331: done checking for any_errors_fatal 44842 1727204530.48331: checking for max_fail_percentage 44842 1727204530.48333: done checking for 
max_fail_percentage 44842 1727204530.48334: checking to see if all hosts have failed and the running result is not ok 44842 1727204530.48334: done checking to see if all hosts have failed 44842 1727204530.48335: getting the remaining hosts for this loop 44842 1727204530.48337: done getting the remaining hosts for this loop 44842 1727204530.48341: getting the next task for host managed-node1 44842 1727204530.48348: done getting next task for host managed-node1 44842 1727204530.48350: ^ task is: TASK: meta (flush_handlers) 44842 1727204530.48352: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204530.48357: getting variables 44842 1727204530.48359: in VariableManager get_vars() 44842 1727204530.48402: Calling all_inventory to load vars for managed-node1 44842 1727204530.48407: Calling groups_inventory to load vars for managed-node1 44842 1727204530.48411: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.48426: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.48429: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.48433: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.49184: done sending task result for task 0affcd87-79f5-aad0-d242-00000000066a 44842 1727204530.49187: WORKER PROCESS EXITING 44842 1727204530.49845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.50821: done with get_vars() 44842 1727204530.50841: done getting variables 44842 1727204530.50898: in VariableManager get_vars() 44842 1727204530.50905: Calling all_inventory to load vars for managed-node1 44842 1727204530.50907: Calling groups_inventory to load vars 
for managed-node1 44842 1727204530.50908: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.50912: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.50913: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.50915: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.51788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.53155: done with get_vars() 44842 1727204530.53180: done queuing things up, now waiting for results queue to drain 44842 1727204530.53182: results queue empty 44842 1727204530.53182: checking for any_errors_fatal 44842 1727204530.53185: done checking for any_errors_fatal 44842 1727204530.53185: checking for max_fail_percentage 44842 1727204530.53186: done checking for max_fail_percentage 44842 1727204530.53190: checking to see if all hosts have failed and the running result is not ok 44842 1727204530.53191: done checking to see if all hosts have failed 44842 1727204530.53191: getting the remaining hosts for this loop 44842 1727204530.53192: done getting the remaining hosts for this loop 44842 1727204530.53194: getting the next task for host managed-node1 44842 1727204530.53197: done getting next task for host managed-node1 44842 1727204530.53199: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 44842 1727204530.53200: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204530.53202: getting variables 44842 1727204530.53202: in VariableManager get_vars() 44842 1727204530.53209: Calling all_inventory to load vars for managed-node1 44842 1727204530.53210: Calling groups_inventory to load vars for managed-node1 44842 1727204530.53212: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.53216: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.53217: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.53219: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.54119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.55615: done with get_vars() 44842 1727204530.55634: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:230 Tuesday 24 September 2024 15:02:10 -0400 (0:00:00.882) 0:00:40.725 ***** 44842 1727204530.55702: entering _queue_task() for managed-node1/include_tasks 44842 1727204530.55942: worker is 1 (out of 1 available) 44842 1727204530.55954: exiting _queue_task() for managed-node1/include_tasks 44842 1727204530.55971: done queuing things up, now waiting for results queue to drain 44842 1727204530.55972: waiting for pending results... 
44842 1727204530.56165: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_absent.yml' 44842 1727204530.56232: in run() - task 0affcd87-79f5-aad0-d242-0000000000a9 44842 1727204530.56244: variable 'ansible_search_path' from source: unknown 44842 1727204530.56278: calling self._execute() 44842 1727204530.56355: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.56359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.56370: variable 'omit' from source: magic vars 44842 1727204530.56647: variable 'ansible_distribution_major_version' from source: facts 44842 1727204530.56658: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204530.56665: _execute() done 44842 1727204530.56668: dumping result to json 44842 1727204530.56671: done dumping result, returning 44842 1727204530.56678: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_absent.yml' [0affcd87-79f5-aad0-d242-0000000000a9] 44842 1727204530.56685: sending task result for task 0affcd87-79f5-aad0-d242-0000000000a9 44842 1727204530.56780: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000a9 44842 1727204530.56782: WORKER PROCESS EXITING 44842 1727204530.56812: no more pending results, returning what we have 44842 1727204530.56817: in VariableManager get_vars() 44842 1727204530.56851: Calling all_inventory to load vars for managed-node1 44842 1727204530.56853: Calling groups_inventory to load vars for managed-node1 44842 1727204530.56857: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.56874: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.56877: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.56880: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.57872: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.59517: done with get_vars() 44842 1727204530.59539: variable 'ansible_search_path' from source: unknown 44842 1727204530.59554: we have included files to process 44842 1727204530.59555: generating all_blocks data 44842 1727204530.59557: done generating all_blocks data 44842 1727204530.59557: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44842 1727204530.59558: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44842 1727204530.59560: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44842 1727204530.59724: in VariableManager get_vars() 44842 1727204530.59740: done with get_vars() 44842 1727204530.59850: done processing included file 44842 1727204530.59852: iterating over new_blocks loaded from include file 44842 1727204530.59854: in VariableManager get_vars() 44842 1727204530.59868: done with get_vars() 44842 1727204530.59869: filtering new block on tags 44842 1727204530.59887: done filtering new block on tags 44842 1727204530.59890: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 44842 1727204530.59895: extending task lists for all hosts with included blocks 44842 1727204530.59965: done extending task lists 44842 1727204530.59966: done processing included files 44842 1727204530.59967: results queue empty 44842 1727204530.59968: checking for any_errors_fatal 44842 1727204530.59970: done checking for any_errors_fatal 44842 1727204530.59970: checking for max_fail_percentage 44842 1727204530.59972: done 
checking for max_fail_percentage 44842 1727204530.59973: checking to see if all hosts have failed and the running result is not ok 44842 1727204530.59973: done checking to see if all hosts have failed 44842 1727204530.59974: getting the remaining hosts for this loop 44842 1727204530.59975: done getting the remaining hosts for this loop 44842 1727204530.59978: getting the next task for host managed-node1 44842 1727204530.59982: done getting next task for host managed-node1 44842 1727204530.59984: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44842 1727204530.59987: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204530.59989: getting variables 44842 1727204530.59990: in VariableManager get_vars() 44842 1727204530.59998: Calling all_inventory to load vars for managed-node1 44842 1727204530.60000: Calling groups_inventory to load vars for managed-node1 44842 1727204530.60003: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.60007: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.60010: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.60013: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.60792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.61726: done with get_vars() 44842 1727204530.61746: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:02:10 -0400 (0:00:00.060) 0:00:40.786 ***** 44842 1727204530.61802: entering _queue_task() for managed-node1/include_tasks 44842 1727204530.62038: worker is 1 (out of 1 available) 44842 1727204530.62051: exiting _queue_task() for managed-node1/include_tasks 44842 1727204530.62062: done queuing things up, now waiting for results queue to drain 44842 1727204530.62063: waiting for pending results... 
44842 1727204530.62277: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 44842 1727204530.62349: in run() - task 0affcd87-79f5-aad0-d242-00000000067b 44842 1727204530.62358: variable 'ansible_search_path' from source: unknown 44842 1727204530.62366: variable 'ansible_search_path' from source: unknown 44842 1727204530.62396: calling self._execute() 44842 1727204530.62475: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.62495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.62519: variable 'omit' from source: magic vars 44842 1727204530.62936: variable 'ansible_distribution_major_version' from source: facts 44842 1727204530.62969: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204530.62980: _execute() done 44842 1727204530.62987: dumping result to json 44842 1727204530.62994: done dumping result, returning 44842 1727204530.63002: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-aad0-d242-00000000067b] 44842 1727204530.63013: sending task result for task 0affcd87-79f5-aad0-d242-00000000067b 44842 1727204530.63142: no more pending results, returning what we have 44842 1727204530.63148: in VariableManager get_vars() 44842 1727204530.63191: Calling all_inventory to load vars for managed-node1 44842 1727204530.63194: Calling groups_inventory to load vars for managed-node1 44842 1727204530.63198: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.63212: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.63215: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.63218: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.63997: done sending task result for task 0affcd87-79f5-aad0-d242-00000000067b 44842 1727204530.64000: WORKER PROCESS EXITING 44842 
1727204530.64396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.65358: done with get_vars() 44842 1727204530.65375: variable 'ansible_search_path' from source: unknown 44842 1727204530.65376: variable 'ansible_search_path' from source: unknown 44842 1727204530.65401: we have included files to process 44842 1727204530.65402: generating all_blocks data 44842 1727204530.65404: done generating all_blocks data 44842 1727204530.65404: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44842 1727204530.65405: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44842 1727204530.65406: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44842 1727204530.66111: done processing included file 44842 1727204530.66112: iterating over new_blocks loaded from include file 44842 1727204530.66113: in VariableManager get_vars() 44842 1727204530.66123: done with get_vars() 44842 1727204530.66124: filtering new block on tags 44842 1727204530.66139: done filtering new block on tags 44842 1727204530.66141: in VariableManager get_vars() 44842 1727204530.66149: done with get_vars() 44842 1727204530.66149: filtering new block on tags 44842 1727204530.66163: done filtering new block on tags 44842 1727204530.66165: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 44842 1727204530.66169: extending task lists for all hosts with included blocks 44842 1727204530.66232: done extending task lists 44842 1727204530.66233: done processing included files 44842 1727204530.66234: results queue empty 44842 
1727204530.66234: checking for any_errors_fatal 44842 1727204530.66236: done checking for any_errors_fatal 44842 1727204530.66237: checking for max_fail_percentage 44842 1727204530.66237: done checking for max_fail_percentage 44842 1727204530.66238: checking to see if all hosts have failed and the running result is not ok 44842 1727204530.66238: done checking to see if all hosts have failed 44842 1727204530.66239: getting the remaining hosts for this loop 44842 1727204530.66240: done getting the remaining hosts for this loop 44842 1727204530.66242: getting the next task for host managed-node1 44842 1727204530.66244: done getting next task for host managed-node1 44842 1727204530.66246: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44842 1727204530.66248: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204530.66250: getting variables 44842 1727204530.66250: in VariableManager get_vars() 44842 1727204530.66294: Calling all_inventory to load vars for managed-node1 44842 1727204530.66297: Calling groups_inventory to load vars for managed-node1 44842 1727204530.66298: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.66304: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.66305: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.66307: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.66983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.67985: done with get_vars() 44842 1727204530.68001: done getting variables 44842 1727204530.68028: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:02:10 -0400 (0:00:00.062) 0:00:40.848 ***** 44842 1727204530.68053: entering _queue_task() for managed-node1/set_fact 44842 1727204530.68297: worker is 1 (out of 1 available) 44842 1727204530.68312: exiting _queue_task() for managed-node1/set_fact 44842 1727204530.68326: done queuing things up, now waiting for results queue to drain 44842 1727204530.68327: waiting for pending results... 
44842 1727204530.68518: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 44842 1727204530.68604: in run() - task 0affcd87-79f5-aad0-d242-00000000068a 44842 1727204530.68614: variable 'ansible_search_path' from source: unknown 44842 1727204530.68617: variable 'ansible_search_path' from source: unknown 44842 1727204530.68644: calling self._execute() 44842 1727204530.68723: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.68726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.68734: variable 'omit' from source: magic vars 44842 1727204530.69019: variable 'ansible_distribution_major_version' from source: facts 44842 1727204530.69031: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204530.69039: variable 'omit' from source: magic vars 44842 1727204530.69073: variable 'omit' from source: magic vars 44842 1727204530.69096: variable 'omit' from source: magic vars 44842 1727204530.69134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204530.69169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204530.69187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204530.69200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204530.69213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204530.69240: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204530.69249: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.69251: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 44842 1727204530.69322: Set connection var ansible_shell_type to sh 44842 1727204530.69331: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204530.69335: Set connection var ansible_connection to ssh 44842 1727204530.69344: Set connection var ansible_pipelining to False 44842 1727204530.69354: Set connection var ansible_timeout to 10 44842 1727204530.69363: Set connection var ansible_shell_executable to /bin/sh 44842 1727204530.69382: variable 'ansible_shell_executable' from source: unknown 44842 1727204530.69385: variable 'ansible_connection' from source: unknown 44842 1727204530.69388: variable 'ansible_module_compression' from source: unknown 44842 1727204530.69391: variable 'ansible_shell_type' from source: unknown 44842 1727204530.69393: variable 'ansible_shell_executable' from source: unknown 44842 1727204530.69396: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.69398: variable 'ansible_pipelining' from source: unknown 44842 1727204530.69401: variable 'ansible_timeout' from source: unknown 44842 1727204530.69403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.69511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204530.69520: variable 'omit' from source: magic vars 44842 1727204530.69526: starting attempt loop 44842 1727204530.69530: running the handler 44842 1727204530.69539: handler run complete 44842 1727204530.69548: attempt loop complete, returning result 44842 1727204530.69551: _execute() done 44842 1727204530.69555: dumping result to json 44842 1727204530.69557: done dumping result, returning 44842 1727204530.69564: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-aad0-d242-00000000068a] 44842 1727204530.69577: sending task result for task 0affcd87-79f5-aad0-d242-00000000068a ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44842 1727204530.69708: no more pending results, returning what we have 44842 1727204530.69712: results queue empty 44842 1727204530.69713: checking for any_errors_fatal 44842 1727204530.69714: done checking for any_errors_fatal 44842 1727204530.69715: checking for max_fail_percentage 44842 1727204530.69717: done checking for max_fail_percentage 44842 1727204530.69718: checking to see if all hosts have failed and the running result is not ok 44842 1727204530.69718: done checking to see if all hosts have failed 44842 1727204530.69719: getting the remaining hosts for this loop 44842 1727204530.69721: done getting the remaining hosts for this loop 44842 1727204530.69725: getting the next task for host managed-node1 44842 1727204530.69732: done getting next task for host managed-node1 44842 1727204530.69734: ^ task is: TASK: Stat profile file 44842 1727204530.69739: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204530.69743: getting variables 44842 1727204530.69744: in VariableManager get_vars() 44842 1727204530.69777: Calling all_inventory to load vars for managed-node1 44842 1727204530.69780: Calling groups_inventory to load vars for managed-node1 44842 1727204530.69790: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204530.69796: done sending task result for task 0affcd87-79f5-aad0-d242-00000000068a 44842 1727204530.69799: WORKER PROCESS EXITING 44842 1727204530.69808: Calling all_plugins_play to load vars for managed-node1 44842 1727204530.69810: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204530.69813: Calling groups_plugins_play to load vars for managed-node1 44842 1727204530.70667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204530.71629: done with get_vars() 44842 1727204530.71651: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:02:10 -0400 (0:00:00.036) 0:00:40.885 ***** 44842 1727204530.71721: entering _queue_task() for managed-node1/stat 44842 1727204530.71962: worker is 1 (out of 1 available) 44842 1727204530.71977: exiting _queue_task() for managed-node1/stat 44842 1727204530.71989: done queuing things up, now waiting for results queue to drain 44842 1727204530.71991: waiting for pending results... 
44842 1727204530.72224: running TaskExecutor() for managed-node1/TASK: Stat profile file 44842 1727204530.72288: in run() - task 0affcd87-79f5-aad0-d242-00000000068b 44842 1727204530.72304: variable 'ansible_search_path' from source: unknown 44842 1727204530.72308: variable 'ansible_search_path' from source: unknown 44842 1727204530.72335: calling self._execute() 44842 1727204530.72419: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.72423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.72433: variable 'omit' from source: magic vars 44842 1727204530.72723: variable 'ansible_distribution_major_version' from source: facts 44842 1727204530.72733: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204530.72740: variable 'omit' from source: magic vars 44842 1727204530.72777: variable 'omit' from source: magic vars 44842 1727204530.72848: variable 'profile' from source: include params 44842 1727204530.72852: variable 'interface' from source: set_fact 44842 1727204530.72907: variable 'interface' from source: set_fact 44842 1727204530.72921: variable 'omit' from source: magic vars 44842 1727204530.72958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204530.72996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204530.73012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204530.73026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204530.73040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204530.73067: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 
1727204530.73070: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.73073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.73140: Set connection var ansible_shell_type to sh 44842 1727204530.73150: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204530.73155: Set connection var ansible_connection to ssh 44842 1727204530.73165: Set connection var ansible_pipelining to False 44842 1727204530.73171: Set connection var ansible_timeout to 10 44842 1727204530.73177: Set connection var ansible_shell_executable to /bin/sh 44842 1727204530.73193: variable 'ansible_shell_executable' from source: unknown 44842 1727204530.73196: variable 'ansible_connection' from source: unknown 44842 1727204530.73199: variable 'ansible_module_compression' from source: unknown 44842 1727204530.73201: variable 'ansible_shell_type' from source: unknown 44842 1727204530.73203: variable 'ansible_shell_executable' from source: unknown 44842 1727204530.73205: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204530.73210: variable 'ansible_pipelining' from source: unknown 44842 1727204530.73212: variable 'ansible_timeout' from source: unknown 44842 1727204530.73216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204530.73363: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204530.73375: variable 'omit' from source: magic vars 44842 1727204530.73385: starting attempt loop 44842 1727204530.73388: running the handler 44842 1727204530.73399: _low_level_execute_command(): starting 44842 1727204530.73405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204530.74267: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.74300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.74303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 44842 1727204530.74318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.74324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204530.74338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.74475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204530.74784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204530.74787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204530.74868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204530.76423: stdout chunk (state=3): >>>/root <<< 44842 1727204530.76537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204530.76608: stderr chunk (state=3): >>><<< 44842 1727204530.76612: stdout chunk (state=3): >>><<< 44842 1727204530.76655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204530.76658: _low_level_execute_command(): starting 44842 1727204530.76673: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737 `" && echo ansible-tmp-1727204530.7663648-47830-120790876426737="` echo /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737 `" ) && sleep 0' 44842 1727204530.77121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.77125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.77162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204530.77167: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.77214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204530.77218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204530.77284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204530.79126: stdout chunk (state=3): >>>ansible-tmp-1727204530.7663648-47830-120790876426737=/root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737 <<< 44842 1727204530.79243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204530.79322: stderr chunk (state=3): >>><<< 44842 1727204530.79325: stdout chunk (state=3): >>><<< 44842 1727204530.79372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204530.7663648-47830-120790876426737=/root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204530.79575: variable 'ansible_module_compression' from source: unknown 44842 1727204530.79578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44842 1727204530.79580: variable 'ansible_facts' from source: unknown 44842 1727204530.79606: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/AnsiballZ_stat.py 44842 1727204530.79762: Sending initial data 44842 1727204530.79775: Sent initial data (153 bytes) 44842 1727204530.80458: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204530.80467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.80496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.80500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.80502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.80565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204530.80571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204530.80573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204530.80623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204530.82466: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204530.82492: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 44842 1727204530.82505: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 44842 1727204530.82516: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 44842 1727204530.82587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpk1ahu1gb 
/root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/AnsiballZ_stat.py <<< 44842 1727204530.82654: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204530.83869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204530.84034: stderr chunk (state=3): >>><<< 44842 1727204530.84046: stdout chunk (state=3): >>><<< 44842 1727204530.84157: done transferring module to remote 44842 1727204530.84161: _low_level_execute_command(): starting 44842 1727204530.84163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/ /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/AnsiballZ_stat.py && sleep 0' 44842 1727204530.85068: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204530.85084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204530.85098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.85116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.85173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204530.85186: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204530.85200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.85220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204530.85232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204530.85254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204530.85270: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 44842 1727204530.85286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.85311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.85324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204530.85337: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204530.85362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.85440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204530.85473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204530.85490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204530.85589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204530.87373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204530.87377: stdout chunk (state=3): >>><<< 44842 1727204530.87380: stderr chunk (state=3): >>><<< 44842 1727204530.87473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204530.87477: _low_level_execute_command(): starting 44842 1727204530.87480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/AnsiballZ_stat.py && sleep 0' 44842 1727204530.88008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204530.88023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204530.88039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204530.88056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.88100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204530.88112: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204530.88126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.88142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204530.88152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204530.88162: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204530.88175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204530.88188: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 44842 1727204530.88202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204530.88214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204530.88225: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204530.88238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204530.88380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204530.88404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204530.88434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204530.88597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.01474: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44842 1727204531.02478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204531.02482: stdout chunk (state=3): >>><<< 44842 1727204531.02485: stderr chunk (state=3): >>><<< 44842 1727204531.02620: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
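The module result above reduces to a single `{"exists": false}` fact, because the task passes `get_attributes=False`, `get_checksum=False` and `get_mime=False`, so only existence is probed. A minimal local sketch of that check, assuming the same `follow=False` semantics (lstat rather than stat); the path is the one shown in the log:

```python
import os

# Hedged sketch of what the stat module run above reports for a missing
# profile file. With follow=False the symlink itself is examined (lstat).
def stat_exists(path, follow=False):
    """Return the minimal {'exists': bool} structure seen in the result."""
    st = os.stat if follow else os.lstat
    try:
        st(path)
        return {"exists": True}
    except (FileNotFoundError, NotADirectoryError):
        return {"exists": False}

stat_exists("/etc/sysconfig/network-scripts/ifcfg-ethtest0")
```

On the managed node the file is absent, so the module returns `"stat": {"exists": false}` and skips every optional field.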
44842 1727204531.02625: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204531.02629: _low_level_execute_command(): starting 44842 1727204531.02631: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204530.7663648-47830-120790876426737/ > /dev/null 2>&1 && sleep 0' 44842 1727204531.03228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204531.03245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.03271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.03291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.03334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.03351: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204531.03373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.03392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 
1727204531.03405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204531.03418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204531.03431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.03447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.03469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.03481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.03491: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204531.03503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.03588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204531.03606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204531.03621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.03710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.05545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204531.05549: stdout chunk (state=3): >>><<< 44842 1727204531.05551: stderr chunk (state=3): >>><<< 44842 1727204531.05680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204531.05683: handler run complete 44842 1727204531.05686: attempt loop complete, returning result 44842 1727204531.05688: _execute() done 44842 1727204531.05690: dumping result to json 44842 1727204531.05692: done dumping result, returning 44842 1727204531.05694: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-aad0-d242-00000000068b] 44842 1727204531.05695: sending task result for task 0affcd87-79f5-aad0-d242-00000000068b 44842 1727204531.05782: done sending task result for task 0affcd87-79f5-aad0-d242-00000000068b 44842 1727204531.05785: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 44842 1727204531.05846: no more pending results, returning what we have 44842 1727204531.05850: results queue empty 44842 1727204531.05851: checking for any_errors_fatal 44842 1727204531.05859: done checking for any_errors_fatal 44842 1727204531.05860: checking for max_fail_percentage 44842 1727204531.05862: done checking for max_fail_percentage 44842 1727204531.05863: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.05869: done checking to see if all hosts have failed 44842 1727204531.05870: getting the remaining hosts for this loop 44842 
1727204531.05873: done getting the remaining hosts for this loop 44842 1727204531.05878: getting the next task for host managed-node1 44842 1727204531.05886: done getting next task for host managed-node1 44842 1727204531.05889: ^ task is: TASK: Set NM profile exist flag based on the profile files 44842 1727204531.05894: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.05899: getting variables 44842 1727204531.05901: in VariableManager get_vars() 44842 1727204531.05936: Calling all_inventory to load vars for managed-node1 44842 1727204531.05939: Calling groups_inventory to load vars for managed-node1 44842 1727204531.05943: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.05956: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.05959: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.05963: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.12050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.13459: done with get_vars() 44842 1727204531.13483: done getting variables 44842 1727204531.13521: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.418) 0:00:41.303 ***** 44842 1727204531.13547: entering _queue_task() for managed-node1/set_fact 44842 1727204531.13798: worker is 1 (out of 1 available) 44842 1727204531.13811: exiting _queue_task() for managed-node1/set_fact 44842 1727204531.13821: done queuing things up, now waiting for results queue to drain 44842 1727204531.13823: waiting for pending results... 
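The `_low_level_execute_command()` calls above follow a fixed per-task lifecycle: probe the remote home with `echo ~`, create a private `ansible-tmp-<epoch>-<pid>-<random>` directory under `~/.ansible/tmp`, copy the AnsiballZ payload over SFTP, `chmod u+x` it, run it, then `rm -rf` the directory. An illustrative local replay of that shell pattern — a `tempfile` scratch dir stands in for the remote `~/.ansible/tmp`, and the dir name here is a placeholder, not the real logged value:

```python
import os
import subprocess
import tempfile
import time

# Each step mirrors one `/bin/sh -c '... && sleep 0'` command from the log.
def sh(cmd):
    return subprocess.run(["/bin/sh", "-c", cmd + " && sleep 0"],
                          capture_output=True, text=True)

home = sh("echo ~").stdout.strip()                       # 1. discover the home dir
base = tempfile.mkdtemp()                                # stand-in for ~/.ansible/tmp
name = "ansible-tmp-%f-%d" % (time.time(), os.getpid())  # epoch-pid naming, as logged
out = sh('( umask 77 && mkdir "%s/%s" && echo %s/%s )' % (base, name, base, name))
tmpdir = out.stdout.strip()                              # 2. create a private tmp dir
sh("chmod u+x %s/" % tmpdir)                             # 3. would hold AnsiballZ_stat.py
sh("rm -f -r %s/ > /dev/null 2>&1" % tmpdir)             # 4. remove it when done
```

The trailing `&& sleep 0` in every command forces the shell to flush its output streams before the SSH channel closes, which is why it appears on each logged command line.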
44842 1727204531.13997: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 44842 1727204531.14081: in run() - task 0affcd87-79f5-aad0-d242-00000000068c 44842 1727204531.14091: variable 'ansible_search_path' from source: unknown 44842 1727204531.14095: variable 'ansible_search_path' from source: unknown 44842 1727204531.14123: calling self._execute() 44842 1727204531.14211: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.14215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.14224: variable 'omit' from source: magic vars 44842 1727204531.14514: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.14525: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.14608: variable 'profile_stat' from source: set_fact 44842 1727204531.14624: Evaluated conditional (profile_stat.stat.exists): False 44842 1727204531.14628: when evaluation is False, skipping this task 44842 1727204531.14631: _execute() done 44842 1727204531.14633: dumping result to json 44842 1727204531.14635: done dumping result, returning 44842 1727204531.14641: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-aad0-d242-00000000068c] 44842 1727204531.14647: sending task result for task 0affcd87-79f5-aad0-d242-00000000068c 44842 1727204531.14733: done sending task result for task 0affcd87-79f5-aad0-d242-00000000068c 44842 1727204531.14736: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44842 1727204531.14782: no more pending results, returning what we have 44842 1727204531.14785: results queue empty 44842 1727204531.14786: checking for any_errors_fatal 44842 1727204531.14795: done checking for any_errors_fatal 44842 1727204531.14796: 
checking for max_fail_percentage 44842 1727204531.14797: done checking for max_fail_percentage 44842 1727204531.14798: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.14799: done checking to see if all hosts have failed 44842 1727204531.14800: getting the remaining hosts for this loop 44842 1727204531.14802: done getting the remaining hosts for this loop 44842 1727204531.14806: getting the next task for host managed-node1 44842 1727204531.14814: done getting next task for host managed-node1 44842 1727204531.14816: ^ task is: TASK: Get NM profile info 44842 1727204531.14820: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.14825: getting variables 44842 1727204531.14827: in VariableManager get_vars() 44842 1727204531.14862: Calling all_inventory to load vars for managed-node1 44842 1727204531.14866: Calling groups_inventory to load vars for managed-node1 44842 1727204531.14871: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.14883: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.14885: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.14888: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.15716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.16677: done with get_vars() 44842 1727204531.16693: done getting variables 44842 1727204531.16759: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.032) 0:00:41.335 ***** 44842 1727204531.16784: entering _queue_task() for managed-node1/shell 44842 1727204531.16785: Creating lock for shell 44842 1727204531.17029: worker is 1 (out of 1 available) 44842 1727204531.17103: exiting _queue_task() for managed-node1/shell 44842 1727204531.17115: done queuing things up, now waiting for results queue to drain 44842 1727204531.17117: waiting for pending results... 
44842 1727204531.17303: running TaskExecutor() for managed-node1/TASK: Get NM profile info 44842 1727204531.17391: in run() - task 0affcd87-79f5-aad0-d242-00000000068d 44842 1727204531.17401: variable 'ansible_search_path' from source: unknown 44842 1727204531.17404: variable 'ansible_search_path' from source: unknown 44842 1727204531.17433: calling self._execute() 44842 1727204531.17514: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.17517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.17527: variable 'omit' from source: magic vars 44842 1727204531.17828: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.17839: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.17844: variable 'omit' from source: magic vars 44842 1727204531.17881: variable 'omit' from source: magic vars 44842 1727204531.17951: variable 'profile' from source: include params 44842 1727204531.17955: variable 'interface' from source: set_fact 44842 1727204531.18009: variable 'interface' from source: set_fact 44842 1727204531.18023: variable 'omit' from source: magic vars 44842 1727204531.18059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204531.18088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204531.18104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204531.18123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204531.18132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204531.18155: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 
1727204531.18159: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.18162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.18232: Set connection var ansible_shell_type to sh 44842 1727204531.18241: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204531.18246: Set connection var ansible_connection to ssh 44842 1727204531.18251: Set connection var ansible_pipelining to False 44842 1727204531.18256: Set connection var ansible_timeout to 10 44842 1727204531.18266: Set connection var ansible_shell_executable to /bin/sh 44842 1727204531.18284: variable 'ansible_shell_executable' from source: unknown 44842 1727204531.18287: variable 'ansible_connection' from source: unknown 44842 1727204531.18290: variable 'ansible_module_compression' from source: unknown 44842 1727204531.18292: variable 'ansible_shell_type' from source: unknown 44842 1727204531.18295: variable 'ansible_shell_executable' from source: unknown 44842 1727204531.18297: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.18299: variable 'ansible_pipelining' from source: unknown 44842 1727204531.18302: variable 'ansible_timeout' from source: unknown 44842 1727204531.18316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.18478: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204531.18512: variable 'omit' from source: magic vars 44842 1727204531.18522: starting attempt loop 44842 1727204531.18528: running the handler 44842 1727204531.18540: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204531.18567: _low_level_execute_command(): starting 44842 1727204531.18579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204531.19333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204531.19350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.19372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.19393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.19436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.19452: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204531.19473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.19494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204531.19506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204531.19516: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204531.19527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.19541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.19557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.19576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.19587: stderr chunk (state=3): >>>debug2: match found <<< 44842 
1727204531.19601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.19682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204531.19709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204531.19742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.19832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.21370: stdout chunk (state=3): >>>/root <<< 44842 1727204531.21473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204531.21532: stderr chunk (state=3): >>><<< 44842 1727204531.21541: stdout chunk (state=3): >>><<< 44842 1727204531.21571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 44842 1727204531.21582: _low_level_execute_command(): starting 44842 1727204531.21590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459 `" && echo ansible-tmp-1727204531.2157013-47907-69071148339459="` echo /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459 `" ) && sleep 0' 44842 1727204531.22185: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204531.22194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.22204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.22217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.22260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.22277: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204531.22287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.22299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204531.22307: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204531.22313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204531.22321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.22330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.22342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.22352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.22361: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204531.22379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.22449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204531.22469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204531.22490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.22572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.24409: stdout chunk (state=3): >>>ansible-tmp-1727204531.2157013-47907-69071148339459=/root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459 <<< 44842 1727204531.24580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204531.24623: stderr chunk (state=3): >>><<< 44842 1727204531.24626: stdout chunk (state=3): >>><<< 44842 1727204531.24880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204531.2157013-47907-69071148339459=/root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204531.24888: variable 'ansible_module_compression' from source: unknown 44842 1727204531.24890: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204531.24893: variable 'ansible_facts' from source: unknown 44842 1727204531.24895: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/AnsiballZ_command.py 44842 1727204531.25326: Sending initial data 44842 1727204531.25330: Sent initial data (155 bytes) 44842 1727204531.26484: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204531.26505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.26520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.26536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.26579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.26591: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204531.26605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.26631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204531.26644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204531.26655: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 44842 1727204531.26674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.26689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.26706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.26725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.26738: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204531.26751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.26821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204531.26848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204531.26868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.26983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.28651: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204531.28702: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204531.28752: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-44842e33nar6b/tmpei1ohk75 /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/AnsiballZ_command.py <<< 44842 1727204531.28799: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204531.30276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204531.30407: stderr chunk (state=3): >>><<< 44842 1727204531.30410: stdout chunk (state=3): >>><<< 44842 1727204531.30413: done transferring module to remote 44842 1727204531.30415: _low_level_execute_command(): starting 44842 1727204531.30418: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/ /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/AnsiballZ_command.py && sleep 0' 44842 1727204531.31100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.32057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.32076: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204531.32090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.32106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204531.32117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204531.32131: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204531.32143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.32154: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 44842 1727204531.32173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.32185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.32194: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204531.32205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.32284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204531.32307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204531.32323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.32402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.34194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204531.34198: stdout chunk (state=3): >>><<< 44842 1727204531.34201: stderr chunk (state=3): >>><<< 44842 1727204531.34298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204531.34303: _low_level_execute_command(): starting 44842 1727204531.34305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/AnsiballZ_command.py && sleep 0' 44842 1727204531.35104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.35108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.35140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.35144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.35146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 44842 1727204531.35160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.35211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204531.35215: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.35287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.50118: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:02:11.482149", "end": "2024-09-24 15:02:11.500330", "delta": "0:00:00.018181", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204531.51262: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. <<< 44842 1727204531.51268: stdout chunk (state=3): >>><<< 44842 1727204531.51270: stderr chunk (state=3): >>><<< 44842 1727204531.51381: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:02:11.482149", "end": "2024-09-24 15:02:11.500330", "delta": "0:00:00.018181", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. 44842 1727204531.51393: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204531.51396: _low_level_execute_command(): starting 44842 1727204531.51398: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204531.2157013-47907-69071148339459/ > /dev/null 2>&1 && sleep 0' 44842 1727204531.52062: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204531.52079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.52093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.52110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.52157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.52177: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204531.52193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.52209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204531.52220: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204531.52232: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204531.52249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204531.52262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204531.52284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204531.52296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204531.52308: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204531.52322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204531.52406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 
1727204531.52422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204531.52437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204531.52545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204531.54295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204531.54350: stderr chunk (state=3): >>><<< 44842 1727204531.54353: stdout chunk (state=3): >>><<< 44842 1727204531.54437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204531.54441: handler run complete 44842 1727204531.54445: Evaluated conditional (False): False 44842 1727204531.54448: attempt loop complete, returning result 44842 1727204531.54450: _execute() done 44842 1727204531.54452: dumping result to json 44842 
1727204531.54454: done dumping result, returning 44842 1727204531.54456: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-aad0-d242-00000000068d] 44842 1727204531.54458: sending task result for task 0affcd87-79f5-aad0-d242-00000000068d 44842 1727204531.54524: done sending task result for task 0affcd87-79f5-aad0-d242-00000000068d 44842 1727204531.54527: WORKER PROCESS EXITING fatal: [managed-node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.018181", "end": "2024-09-24 15:02:11.500330", "rc": 1, "start": "2024-09-24 15:02:11.482149" } MSG: non-zero return code ...ignoring 44842 1727204531.54598: no more pending results, returning what we have 44842 1727204531.54602: results queue empty 44842 1727204531.54602: checking for any_errors_fatal 44842 1727204531.54609: done checking for any_errors_fatal 44842 1727204531.54609: checking for max_fail_percentage 44842 1727204531.54611: done checking for max_fail_percentage 44842 1727204531.54612: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.54613: done checking to see if all hosts have failed 44842 1727204531.54613: getting the remaining hosts for this loop 44842 1727204531.54615: done getting the remaining hosts for this loop 44842 1727204531.54619: getting the next task for host managed-node1 44842 1727204531.54625: done getting next task for host managed-node1 44842 1727204531.54628: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44842 1727204531.54631: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204531.54635: getting variables 44842 1727204531.54636: in VariableManager get_vars() 44842 1727204531.54669: Calling all_inventory to load vars for managed-node1 44842 1727204531.54672: Calling groups_inventory to load vars for managed-node1 44842 1727204531.54675: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.54685: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.54687: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.54690: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.56156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.57671: done with get_vars() 44842 1727204531.57692: done getting variables 44842 1727204531.57741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.409) 0:00:41.745 ***** 44842 1727204531.57769: entering _queue_task() for managed-node1/set_fact 44842 1727204531.58008: worker is 1 (out of 1 available) 44842 1727204531.58020: exiting _queue_task() for managed-node1/set_fact 44842 1727204531.58032: done queuing things up, now waiting for results queue to drain 44842 1727204531.58033: waiting for pending results... 44842 1727204531.58216: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44842 1727204531.58306: in run() - task 0affcd87-79f5-aad0-d242-00000000068e 44842 1727204531.58317: variable 'ansible_search_path' from source: unknown 44842 1727204531.58321: variable 'ansible_search_path' from source: unknown 44842 1727204531.58354: calling self._execute() 44842 1727204531.58436: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.58441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.58449: variable 'omit' from source: magic vars 44842 1727204531.58743: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.58754: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.58851: variable 'nm_profile_exists' from source: set_fact 44842 1727204531.58867: Evaluated conditional (nm_profile_exists.rc == 0): False 44842 1727204531.58870: when evaluation is False, skipping this task 44842 1727204531.58873: _execute() done 44842 1727204531.58875: dumping result to json 44842 1727204531.58878: done dumping result, returning 44842 1727204531.58881: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-aad0-d242-00000000068e] 44842 1727204531.58887: 
sending task result for task 0affcd87-79f5-aad0-d242-00000000068e 44842 1727204531.58977: done sending task result for task 0affcd87-79f5-aad0-d242-00000000068e 44842 1727204531.58980: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 44842 1727204531.59055: no more pending results, returning what we have 44842 1727204531.59059: results queue empty 44842 1727204531.59062: checking for any_errors_fatal 44842 1727204531.59071: done checking for any_errors_fatal 44842 1727204531.59072: checking for max_fail_percentage 44842 1727204531.59074: done checking for max_fail_percentage 44842 1727204531.59075: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.59076: done checking to see if all hosts have failed 44842 1727204531.59077: getting the remaining hosts for this loop 44842 1727204531.59079: done getting the remaining hosts for this loop 44842 1727204531.59083: getting the next task for host managed-node1 44842 1727204531.59097: done getting next task for host managed-node1 44842 1727204531.59100: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44842 1727204531.59104: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204531.59108: getting variables 44842 1727204531.59110: in VariableManager get_vars() 44842 1727204531.59141: Calling all_inventory to load vars for managed-node1 44842 1727204531.59144: Calling groups_inventory to load vars for managed-node1 44842 1727204531.59148: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.59166: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.59169: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.59173: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.60471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.61494: done with get_vars() 44842 1727204531.61513: done getting variables 44842 1727204531.61556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204531.61653: variable 'profile' from source: include params 44842 1727204531.61656: variable 'interface' from source: set_fact 44842 1727204531.61707: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.039) 0:00:41.785 ***** 44842 1727204531.61732: entering _queue_task() for managed-node1/command 44842 1727204531.61978: worker is 1 (out of 1 available) 44842 1727204531.61991: exiting _queue_task() for managed-node1/command 44842 1727204531.62004: done 
queuing things up, now waiting for results queue to drain 44842 1727204531.62005: waiting for pending results... 44842 1727204531.62185: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 44842 1727204531.62269: in run() - task 0affcd87-79f5-aad0-d242-000000000690 44842 1727204531.62278: variable 'ansible_search_path' from source: unknown 44842 1727204531.62282: variable 'ansible_search_path' from source: unknown 44842 1727204531.62311: calling self._execute() 44842 1727204531.62419: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.62423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.62430: variable 'omit' from source: magic vars 44842 1727204531.62723: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.62734: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.62821: variable 'profile_stat' from source: set_fact 44842 1727204531.62831: Evaluated conditional (profile_stat.stat.exists): False 44842 1727204531.62834: when evaluation is False, skipping this task 44842 1727204531.62837: _execute() done 44842 1727204531.62840: dumping result to json 44842 1727204531.62842: done dumping result, returning 44842 1727204531.62849: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0affcd87-79f5-aad0-d242-000000000690] 44842 1727204531.62855: sending task result for task 0affcd87-79f5-aad0-d242-000000000690 44842 1727204531.62943: done sending task result for task 0affcd87-79f5-aad0-d242-000000000690 44842 1727204531.62946: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44842 1727204531.63007: no more pending results, returning what we have 44842 1727204531.63011: results queue empty 44842 1727204531.63012: 
checking for any_errors_fatal 44842 1727204531.63020: done checking for any_errors_fatal 44842 1727204531.63023: checking for max_fail_percentage 44842 1727204531.63025: done checking for max_fail_percentage 44842 1727204531.63026: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.63027: done checking to see if all hosts have failed 44842 1727204531.63028: getting the remaining hosts for this loop 44842 1727204531.63029: done getting the remaining hosts for this loop 44842 1727204531.63034: getting the next task for host managed-node1 44842 1727204531.63044: done getting next task for host managed-node1 44842 1727204531.63046: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44842 1727204531.63051: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.63063: getting variables 44842 1727204531.63067: in VariableManager get_vars() 44842 1727204531.63095: Calling all_inventory to load vars for managed-node1 44842 1727204531.63098: Calling groups_inventory to load vars for managed-node1 44842 1727204531.63101: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.63111: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.63113: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.63116: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.64090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.65087: done with get_vars() 44842 1727204531.65104: done getting variables 44842 1727204531.65149: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204531.65233: variable 'profile' from source: include params 44842 1727204531.65236: variable 'interface' from source: set_fact 44842 1727204531.65285: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.035) 0:00:41.821 ***** 44842 1727204531.65315: entering _queue_task() for managed-node1/set_fact 44842 1727204531.65547: worker is 1 (out of 1 available) 44842 1727204531.65562: exiting _queue_task() for managed-node1/set_fact 44842 1727204531.65575: done queuing things up, now waiting for results queue to drain 44842 1727204531.65577: waiting for pending results... 
44842 1727204531.65773: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 44842 1727204531.65851: in run() - task 0affcd87-79f5-aad0-d242-000000000691 44842 1727204531.65862: variable 'ansible_search_path' from source: unknown 44842 1727204531.65868: variable 'ansible_search_path' from source: unknown 44842 1727204531.65897: calling self._execute() 44842 1727204531.65976: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.65979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.65988: variable 'omit' from source: magic vars 44842 1727204531.66382: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.66402: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.66538: variable 'profile_stat' from source: set_fact 44842 1727204531.66572: Evaluated conditional (profile_stat.stat.exists): False 44842 1727204531.66582: when evaluation is False, skipping this task 44842 1727204531.66589: _execute() done 44842 1727204531.66596: dumping result to json 44842 1727204531.66603: done dumping result, returning 44842 1727204531.66611: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0affcd87-79f5-aad0-d242-000000000691] 44842 1727204531.66622: sending task result for task 0affcd87-79f5-aad0-d242-000000000691 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44842 1727204531.66782: no more pending results, returning what we have 44842 1727204531.66787: results queue empty 44842 1727204531.66788: checking for any_errors_fatal 44842 1727204531.66794: done checking for any_errors_fatal 44842 1727204531.66795: checking for max_fail_percentage 44842 1727204531.66797: done checking for max_fail_percentage 44842 1727204531.66798: checking to see if all 
hosts have failed and the running result is not ok 44842 1727204531.66799: done checking to see if all hosts have failed 44842 1727204531.66800: getting the remaining hosts for this loop 44842 1727204531.66802: done getting the remaining hosts for this loop 44842 1727204531.66806: getting the next task for host managed-node1 44842 1727204531.66814: done getting next task for host managed-node1 44842 1727204531.66817: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44842 1727204531.66822: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.66826: getting variables 44842 1727204531.66828: in VariableManager get_vars() 44842 1727204531.66873: Calling all_inventory to load vars for managed-node1 44842 1727204531.66877: Calling groups_inventory to load vars for managed-node1 44842 1727204531.66881: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.66895: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.66897: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.66901: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.67821: done sending task result for task 0affcd87-79f5-aad0-d242-000000000691 44842 1727204531.67825: WORKER PROCESS EXITING 44842 1727204531.68849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.70928: done with get_vars() 44842 1727204531.70958: done getting variables 44842 1727204531.71037: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204531.71179: variable 'profile' from source: include params 44842 1727204531.71183: variable 'interface' from source: set_fact 44842 1727204531.71257: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.059) 0:00:41.881 ***** 44842 1727204531.71294: entering _queue_task() for managed-node1/command 44842 1727204531.71676: worker is 1 (out of 1 available) 44842 1727204531.71689: exiting _queue_task() for managed-node1/command 44842 
1727204531.71702: done queuing things up, now waiting for results queue to drain 44842 1727204531.71703: waiting for pending results... 44842 1727204531.72028: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 44842 1727204531.72179: in run() - task 0affcd87-79f5-aad0-d242-000000000692 44842 1727204531.72205: variable 'ansible_search_path' from source: unknown 44842 1727204531.72216: variable 'ansible_search_path' from source: unknown 44842 1727204531.72269: calling self._execute() 44842 1727204531.72400: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.72416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.72437: variable 'omit' from source: magic vars 44842 1727204531.72872: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.72892: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.73040: variable 'profile_stat' from source: set_fact 44842 1727204531.73070: Evaluated conditional (profile_stat.stat.exists): False 44842 1727204531.73084: when evaluation is False, skipping this task 44842 1727204531.73092: _execute() done 44842 1727204531.73098: dumping result to json 44842 1727204531.73106: done dumping result, returning 44842 1727204531.73114: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0affcd87-79f5-aad0-d242-000000000692] 44842 1727204531.73130: sending task result for task 0affcd87-79f5-aad0-d242-000000000692 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44842 1727204531.73297: no more pending results, returning what we have 44842 1727204531.73301: results queue empty 44842 1727204531.73303: checking for any_errors_fatal 44842 1727204531.73309: done checking for any_errors_fatal 44842 1727204531.73310: checking for 
max_fail_percentage 44842 1727204531.73312: done checking for max_fail_percentage 44842 1727204531.73313: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.73314: done checking to see if all hosts have failed 44842 1727204531.73315: getting the remaining hosts for this loop 44842 1727204531.73317: done getting the remaining hosts for this loop 44842 1727204531.73322: getting the next task for host managed-node1 44842 1727204531.73330: done getting next task for host managed-node1 44842 1727204531.73333: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44842 1727204531.73338: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.73344: getting variables 44842 1727204531.73346: in VariableManager get_vars() 44842 1727204531.73383: Calling all_inventory to load vars for managed-node1 44842 1727204531.73387: Calling groups_inventory to load vars for managed-node1 44842 1727204531.73391: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.73405: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.73408: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.73411: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.74418: done sending task result for task 0affcd87-79f5-aad0-d242-000000000692 44842 1727204531.74422: WORKER PROCESS EXITING 44842 1727204531.75327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.77136: done with get_vars() 44842 1727204531.77175: done getting variables 44842 1727204531.77242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204531.77369: variable 'profile' from source: include params 44842 1727204531.77374: variable 'interface' from source: set_fact 44842 1727204531.77435: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.061) 0:00:41.942 ***** 44842 1727204531.77473: entering _queue_task() for managed-node1/set_fact 44842 1727204531.77810: worker is 1 (out of 1 available) 44842 1727204531.77824: exiting _queue_task() for managed-node1/set_fact 44842 
1727204531.77836: done queuing things up, now waiting for results queue to drain 44842 1727204531.77837: waiting for pending results... 44842 1727204531.78138: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 44842 1727204531.78274: in run() - task 0affcd87-79f5-aad0-d242-000000000693 44842 1727204531.78298: variable 'ansible_search_path' from source: unknown 44842 1727204531.78307: variable 'ansible_search_path' from source: unknown 44842 1727204531.78350: calling self._execute() 44842 1727204531.78484: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.78497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.78512: variable 'omit' from source: magic vars 44842 1727204531.78919: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.78944: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.79085: variable 'profile_stat' from source: set_fact 44842 1727204531.79106: Evaluated conditional (profile_stat.stat.exists): False 44842 1727204531.79115: when evaluation is False, skipping this task 44842 1727204531.79123: _execute() done 44842 1727204531.79130: dumping result to json 44842 1727204531.79138: done dumping result, returning 44842 1727204531.79147: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0affcd87-79f5-aad0-d242-000000000693] 44842 1727204531.79168: sending task result for task 0affcd87-79f5-aad0-d242-000000000693 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44842 1727204531.79328: no more pending results, returning what we have 44842 1727204531.79333: results queue empty 44842 1727204531.79334: checking for any_errors_fatal 44842 1727204531.79340: done checking for any_errors_fatal 44842 1727204531.79341: checking for 
max_fail_percentage 44842 1727204531.79343: done checking for max_fail_percentage 44842 1727204531.79344: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.79345: done checking to see if all hosts have failed 44842 1727204531.79345: getting the remaining hosts for this loop 44842 1727204531.79348: done getting the remaining hosts for this loop 44842 1727204531.79352: getting the next task for host managed-node1 44842 1727204531.79367: done getting next task for host managed-node1 44842 1727204531.79371: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 44842 1727204531.79375: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.79380: getting variables 44842 1727204531.79382: in VariableManager get_vars() 44842 1727204531.79416: Calling all_inventory to load vars for managed-node1 44842 1727204531.79419: Calling groups_inventory to load vars for managed-node1 44842 1727204531.79423: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.79437: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.79441: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.79444: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.80485: done sending task result for task 0affcd87-79f5-aad0-d242-000000000693 44842 1727204531.80489: WORKER PROCESS EXITING 44842 1727204531.81405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.82636: done with get_vars() 44842 1727204531.82665: done getting variables 44842 1727204531.82711: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204531.82804: variable 'profile' from source: include params 44842 1727204531.82807: variable 'interface' from source: set_fact 44842 1727204531.82848: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.054) 0:00:41.996 ***** 44842 1727204531.82878: entering _queue_task() for managed-node1/assert 44842 1727204531.83119: worker is 1 (out of 1 available) 44842 1727204531.83133: exiting _queue_task() for managed-node1/assert 44842 
1727204531.83146: done queuing things up, now waiting for results queue to drain 44842 1727204531.83147: waiting for pending results... 44842 1727204531.83333: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' 44842 1727204531.83422: in run() - task 0affcd87-79f5-aad0-d242-00000000067c 44842 1727204531.83433: variable 'ansible_search_path' from source: unknown 44842 1727204531.83436: variable 'ansible_search_path' from source: unknown 44842 1727204531.83471: calling self._execute() 44842 1727204531.83553: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.83556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.83570: variable 'omit' from source: magic vars 44842 1727204531.83845: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.83857: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.83867: variable 'omit' from source: magic vars 44842 1727204531.83896: variable 'omit' from source: magic vars 44842 1727204531.83969: variable 'profile' from source: include params 44842 1727204531.83973: variable 'interface' from source: set_fact 44842 1727204531.84019: variable 'interface' from source: set_fact 44842 1727204531.84032: variable 'omit' from source: magic vars 44842 1727204531.84075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204531.84103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204531.84121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204531.84135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204531.84146: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204531.84176: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204531.84180: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.84182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.84247: Set connection var ansible_shell_type to sh 44842 1727204531.84256: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204531.84261: Set connection var ansible_connection to ssh 44842 1727204531.84270: Set connection var ansible_pipelining to False 44842 1727204531.84275: Set connection var ansible_timeout to 10 44842 1727204531.84281: Set connection var ansible_shell_executable to /bin/sh 44842 1727204531.84300: variable 'ansible_shell_executable' from source: unknown 44842 1727204531.84303: variable 'ansible_connection' from source: unknown 44842 1727204531.84306: variable 'ansible_module_compression' from source: unknown 44842 1727204531.84308: variable 'ansible_shell_type' from source: unknown 44842 1727204531.84311: variable 'ansible_shell_executable' from source: unknown 44842 1727204531.84313: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.84315: variable 'ansible_pipelining' from source: unknown 44842 1727204531.84317: variable 'ansible_timeout' from source: unknown 44842 1727204531.84322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.84425: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204531.84435: variable 'omit' from source: magic vars 44842 1727204531.84440: starting 
attempt loop 44842 1727204531.84443: running the handler 44842 1727204531.84529: variable 'lsr_net_profile_exists' from source: set_fact 44842 1727204531.84533: Evaluated conditional (not lsr_net_profile_exists): True 44842 1727204531.84539: handler run complete 44842 1727204531.84550: attempt loop complete, returning result 44842 1727204531.84553: _execute() done 44842 1727204531.84557: dumping result to json 44842 1727204531.84563: done dumping result, returning 44842 1727204531.84574: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' [0affcd87-79f5-aad0-d242-00000000067c] 44842 1727204531.84576: sending task result for task 0affcd87-79f5-aad0-d242-00000000067c 44842 1727204531.84671: done sending task result for task 0affcd87-79f5-aad0-d242-00000000067c 44842 1727204531.84675: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204531.84720: no more pending results, returning what we have 44842 1727204531.84723: results queue empty 44842 1727204531.84724: checking for any_errors_fatal 44842 1727204531.84737: done checking for any_errors_fatal 44842 1727204531.84738: checking for max_fail_percentage 44842 1727204531.84740: done checking for max_fail_percentage 44842 1727204531.84741: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.84742: done checking to see if all hosts have failed 44842 1727204531.84743: getting the remaining hosts for this loop 44842 1727204531.84744: done getting the remaining hosts for this loop 44842 1727204531.84748: getting the next task for host managed-node1 44842 1727204531.84757: done getting next task for host managed-node1 44842 1727204531.84762: ^ task is: TASK: Include the task 'assert_device_absent.yml' 44842 1727204531.84765: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204531.84769: getting variables 44842 1727204531.84771: in VariableManager get_vars() 44842 1727204531.84801: Calling all_inventory to load vars for managed-node1 44842 1727204531.84804: Calling groups_inventory to load vars for managed-node1 44842 1727204531.84808: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.84818: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.84820: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.84822: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.85686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.86652: done with get_vars() 44842 1727204531.86674: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:234 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.038) 0:00:42.035 ***** 44842 1727204531.86745: entering _queue_task() for managed-node1/include_tasks 44842 1727204531.86999: worker is 1 (out of 1 available) 44842 1727204531.87013: exiting _queue_task() for managed-node1/include_tasks 44842 1727204531.87025: done queuing things up, now waiting for results queue to drain 44842 1727204531.87027: waiting for pending results... 
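The task block above shows the `assert` action plugin evaluating the conditional `(not lsr_net_profile_exists): True` and returning `ok` with "All assertions passed". A hedged reconstruction of what the task at `tasks/assert_profile_absent.yml:5` plausibly looks like — only the task name, the evaluated conditional, and the passing result are confirmed by the log; the failure message wording is an assumption:

```yaml
# Sketch of assert_profile_absent.yml (task at line 5).
# Confirmed by the log: the templated task name and the conditional
# `not lsr_net_profile_exists` (set earlier via set_fact).
- name: "Assert that the profile is absent - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists
    msg: "Profile {{ profile }} is still present"  # msg text is assumed, not from the log
```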
44842 1727204531.87229: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_absent.yml' 44842 1727204531.87303: in run() - task 0affcd87-79f5-aad0-d242-0000000000aa 44842 1727204531.87313: variable 'ansible_search_path' from source: unknown 44842 1727204531.87346: calling self._execute() 44842 1727204531.87430: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.87433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.87441: variable 'omit' from source: magic vars 44842 1727204531.87726: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.87737: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.87743: _execute() done 44842 1727204531.87746: dumping result to json 44842 1727204531.87750: done dumping result, returning 44842 1727204531.87755: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_absent.yml' [0affcd87-79f5-aad0-d242-0000000000aa] 44842 1727204531.87761: sending task result for task 0affcd87-79f5-aad0-d242-0000000000aa 44842 1727204531.87850: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000aa 44842 1727204531.87853: WORKER PROCESS EXITING 44842 1727204531.87887: no more pending results, returning what we have 44842 1727204531.87892: in VariableManager get_vars() 44842 1727204531.87926: Calling all_inventory to load vars for managed-node1 44842 1727204531.87929: Calling groups_inventory to load vars for managed-node1 44842 1727204531.87932: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.87945: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.87948: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.87951: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.89210: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.90718: done with get_vars() 44842 1727204531.90742: variable 'ansible_search_path' from source: unknown 44842 1727204531.90755: we have included files to process 44842 1727204531.90755: generating all_blocks data 44842 1727204531.90757: done generating all_blocks data 44842 1727204531.90763: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44842 1727204531.90765: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44842 1727204531.90767: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44842 1727204531.90888: in VariableManager get_vars() 44842 1727204531.90899: done with get_vars() 44842 1727204531.90982: done processing included file 44842 1727204531.90984: iterating over new_blocks loaded from include file 44842 1727204531.90985: in VariableManager get_vars() 44842 1727204531.90994: done with get_vars() 44842 1727204531.90995: filtering new block on tags 44842 1727204531.91007: done filtering new block on tags 44842 1727204531.91009: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 44842 1727204531.91013: extending task lists for all hosts with included blocks 44842 1727204531.91115: done extending task lists 44842 1727204531.91116: done processing included files 44842 1727204531.91116: results queue empty 44842 1727204531.91117: checking for any_errors_fatal 44842 1727204531.91119: done checking for any_errors_fatal 44842 1727204531.91120: checking for max_fail_percentage 44842 1727204531.91120: done 
checking for max_fail_percentage 44842 1727204531.91121: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.91122: done checking to see if all hosts have failed 44842 1727204531.91122: getting the remaining hosts for this loop 44842 1727204531.91123: done getting the remaining hosts for this loop 44842 1727204531.91125: getting the next task for host managed-node1 44842 1727204531.91127: done getting next task for host managed-node1 44842 1727204531.91129: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44842 1727204531.91130: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.91132: getting variables 44842 1727204531.91133: in VariableManager get_vars() 44842 1727204531.91139: Calling all_inventory to load vars for managed-node1 44842 1727204531.91140: Calling groups_inventory to load vars for managed-node1 44842 1727204531.91142: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.91146: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.91147: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.91149: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.91870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.92870: done with get_vars() 44842 1727204531.92887: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.061) 0:00:42.097 ***** 44842 1727204531.92945: entering _queue_task() for managed-node1/include_tasks 44842 1727204531.93190: worker is 1 (out of 1 available) 44842 1727204531.93203: exiting _queue_task() for managed-node1/include_tasks 44842 1727204531.93216: done queuing things up, now waiting for results queue to drain 44842 1727204531.93217: waiting for pending results... 
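The include chain visible in the log runs `tests_routing_rules.yml:234` → `tasks/assert_device_absent.yml:3` → `tasks/get_interface_stat.yml`. A minimal sketch of that chain, assuming plain `include_tasks` entries (the file paths come from the log; the exact task bodies are assumed):

```yaml
# In tests_routing_rules.yml (around line 234):
- name: Include the task 'assert_device_absent.yml'
  ansible.builtin.include_tasks: tasks/assert_device_absent.yml

# In assert_device_absent.yml (line 3):
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: tasks/get_interface_stat.yml
```

Each `include_tasks` is why the log shows "we have included files to process", "generating all_blocks data", and "extending task lists for all hosts with included blocks": included files are parsed and spliced into the host's task list at runtime.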
44842 1727204531.93411: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 44842 1727204531.93489: in run() - task 0affcd87-79f5-aad0-d242-0000000006c4 44842 1727204531.93499: variable 'ansible_search_path' from source: unknown 44842 1727204531.93502: variable 'ansible_search_path' from source: unknown 44842 1727204531.93743: calling self._execute() 44842 1727204531.93748: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204531.93751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204531.93807: variable 'omit' from source: magic vars 44842 1727204531.94039: variable 'ansible_distribution_major_version' from source: facts 44842 1727204531.94054: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204531.94063: _execute() done 44842 1727204531.94067: dumping result to json 44842 1727204531.94069: done dumping result, returning 44842 1727204531.94074: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-aad0-d242-0000000006c4] 44842 1727204531.94079: sending task result for task 0affcd87-79f5-aad0-d242-0000000006c4 44842 1727204531.94166: done sending task result for task 0affcd87-79f5-aad0-d242-0000000006c4 44842 1727204531.94169: WORKER PROCESS EXITING 44842 1727204531.94200: no more pending results, returning what we have 44842 1727204531.94206: in VariableManager get_vars() 44842 1727204531.94239: Calling all_inventory to load vars for managed-node1 44842 1727204531.94242: Calling groups_inventory to load vars for managed-node1 44842 1727204531.94245: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.94266: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.94269: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.94273: Calling groups_plugins_play to load vars for managed-node1 44842 
1727204531.95140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.96924: done with get_vars() 44842 1727204531.96947: variable 'ansible_search_path' from source: unknown 44842 1727204531.96948: variable 'ansible_search_path' from source: unknown 44842 1727204531.96990: we have included files to process 44842 1727204531.96992: generating all_blocks data 44842 1727204531.96994: done generating all_blocks data 44842 1727204531.96995: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44842 1727204531.96996: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44842 1727204531.96998: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44842 1727204531.97202: done processing included file 44842 1727204531.97204: iterating over new_blocks loaded from include file 44842 1727204531.97206: in VariableManager get_vars() 44842 1727204531.97219: done with get_vars() 44842 1727204531.97220: filtering new block on tags 44842 1727204531.97242: done filtering new block on tags 44842 1727204531.97245: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 44842 1727204531.97251: extending task lists for all hosts with included blocks 44842 1727204531.97361: done extending task lists 44842 1727204531.97362: done processing included files 44842 1727204531.97363: results queue empty 44842 1727204531.97365: checking for any_errors_fatal 44842 1727204531.97369: done checking for any_errors_fatal 44842 1727204531.97369: checking for max_fail_percentage 44842 1727204531.97370: done checking for 
max_fail_percentage 44842 1727204531.97371: checking to see if all hosts have failed and the running result is not ok 44842 1727204531.97372: done checking to see if all hosts have failed 44842 1727204531.97373: getting the remaining hosts for this loop 44842 1727204531.97374: done getting the remaining hosts for this loop 44842 1727204531.97377: getting the next task for host managed-node1 44842 1727204531.97381: done getting next task for host managed-node1 44842 1727204531.97384: ^ task is: TASK: Get stat for interface {{ interface }} 44842 1727204531.97387: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204531.97389: getting variables 44842 1727204531.97390: in VariableManager get_vars() 44842 1727204531.97399: Calling all_inventory to load vars for managed-node1 44842 1727204531.97402: Calling groups_inventory to load vars for managed-node1 44842 1727204531.97404: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204531.97409: Calling all_plugins_play to load vars for managed-node1 44842 1727204531.97411: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204531.97414: Calling groups_plugins_play to load vars for managed-node1 44842 1727204531.98825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204531.99908: done with get_vars() 44842 1727204531.99926: done getting variables 44842 1727204532.00049: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:02:11 -0400 (0:00:00.071) 0:00:42.168 ***** 44842 1727204532.00078: entering _queue_task() for managed-node1/stat 44842 1727204532.00322: worker is 1 (out of 1 available) 44842 1727204532.00335: exiting _queue_task() for managed-node1/stat 44842 1727204532.00348: done queuing things up, now waiting for results queue to drain 44842 1727204532.00349: waiting for pending results... 
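The next task, "Get stat for interface ethtest0" at `get_interface_stat.yml:3`, dispatches the `stat` module (the log loads ActionModule 'normal' and later transfers `AnsiballZ_stat.py`). A hedged sketch of that task — the templated name and the `stat` action are confirmed by the log; the `/sys/class/net` path and the register variable name are assumptions:

```yaml
# Sketch of get_interface_stat.yml (task at line 3).
# A network device exists iff /sys/class/net/<name> exists, so stat-ing
# that path is the usual way these tests check device absence.
- name: "Get stat for interface {{ interface }}"
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"  # path is an assumption
  register: interface_stat                  # register name is an assumption
```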
44842 1727204532.00538: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 44842 1727204532.00622: in run() - task 0affcd87-79f5-aad0-d242-0000000006de 44842 1727204532.00634: variable 'ansible_search_path' from source: unknown 44842 1727204532.00638: variable 'ansible_search_path' from source: unknown 44842 1727204532.00668: calling self._execute() 44842 1727204532.00748: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.00751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.00766: variable 'omit' from source: magic vars 44842 1727204532.01155: variable 'ansible_distribution_major_version' from source: facts 44842 1727204532.01178: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204532.01189: variable 'omit' from source: magic vars 44842 1727204532.01243: variable 'omit' from source: magic vars 44842 1727204532.01355: variable 'interface' from source: set_fact 44842 1727204532.01386: variable 'omit' from source: magic vars 44842 1727204532.01436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204532.01484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204532.01511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204532.01544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204532.01566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204532.01606: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204532.01614: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.01622: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.01735: Set connection var ansible_shell_type to sh 44842 1727204532.01755: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204532.01769: Set connection var ansible_connection to ssh 44842 1727204532.01779: Set connection var ansible_pipelining to False 44842 1727204532.01788: Set connection var ansible_timeout to 10 44842 1727204532.01806: Set connection var ansible_shell_executable to /bin/sh 44842 1727204532.01837: variable 'ansible_shell_executable' from source: unknown 44842 1727204532.01847: variable 'ansible_connection' from source: unknown 44842 1727204532.01857: variable 'ansible_module_compression' from source: unknown 44842 1727204532.01867: variable 'ansible_shell_type' from source: unknown 44842 1727204532.01874: variable 'ansible_shell_executable' from source: unknown 44842 1727204532.01880: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.01886: variable 'ansible_pipelining' from source: unknown 44842 1727204532.01892: variable 'ansible_timeout' from source: unknown 44842 1727204532.01898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.02121: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44842 1727204532.02140: variable 'omit' from source: magic vars 44842 1727204532.02149: starting attempt loop 44842 1727204532.02155: running the handler 44842 1727204532.02178: _low_level_execute_command(): starting 44842 1727204532.02193: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204532.03027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.03042: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 44842 1727204532.03056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.03084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.03133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.03145: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.03158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.03182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.03197: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.03210: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.03226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.03240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.03256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.03274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.03285: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.03299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.03390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.03416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.03439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.03531: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44842 1727204532.05186: stdout chunk (state=3): >>>/root <<< 44842 1727204532.05399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.05403: stdout chunk (state=3): >>><<< 44842 1727204532.05405: stderr chunk (state=3): >>><<< 44842 1727204532.05523: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.05527: _low_level_execute_command(): starting 44842 1727204532.05531: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534 `" && echo ansible-tmp-1727204532.0542598-48058-183764990112534="` echo /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534 `" ) && sleep 0' 44842 
1727204532.06135: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.06151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.06175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.06195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.06238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.06251: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.06273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.06291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.06301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.06311: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.06322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.06334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.06347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.06357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.06373: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.06385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.06468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.06492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.06510: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.06784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.08482: stdout chunk (state=3): >>>ansible-tmp-1727204532.0542598-48058-183764990112534=/root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534 <<< 44842 1727204532.08601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.08698: stderr chunk (state=3): >>><<< 44842 1727204532.08709: stdout chunk (state=3): >>><<< 44842 1727204532.08974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204532.0542598-48058-183764990112534=/root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.08978: variable 'ansible_module_compression' from source: unknown 
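The `_low_level_execute_command()` entries above show Ansible's two-step remote bootstrap over the multiplexed SSH connection: first `echo ~` to resolve the remote home directory, then a `( umask 77 && mkdir -p ... )` to create a private, timestamped temp directory for the module payload. A local sketch of the permission effect of that logged command (the path below is illustrative, not the path from the log):

```shell
# With umask 77 in effect, mkdir creates the directory mode 700
# (owner-only), which keeps transferred module payloads such as
# AnsiballZ_stat.py private to the connecting user.
tmproot="$(mktemp -d)"
( umask 77 && mkdir -p "$tmproot/ansible-tmp-demo" )
stat -c '%a' "$tmproot/ansible-tmp-demo"
```

The subshell matters: it scopes the `umask` change to the directory creation without altering the umask of the invoking shell.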
44842 1727204532.08983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44842 1727204532.08985: variable 'ansible_facts' from source: unknown 44842 1727204532.09019: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/AnsiballZ_stat.py 44842 1727204532.09187: Sending initial data 44842 1727204532.09191: Sent initial data (153 bytes) 44842 1727204532.10377: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.10394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.10409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.10436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.10485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.10498: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.10513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.10537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.10550: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.10568: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.10582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.10597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.10613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.10626: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.10639: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.10658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.10738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.10773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.10792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.10894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.12619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204532.12673: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204532.12722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpt30to5pd /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/AnsiballZ_stat.py <<< 44842 1727204532.12873: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204532.14172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.14325: stderr chunk (state=3): >>><<< 44842 1727204532.14328: stdout chunk (state=3): 
>>><<< 44842 1727204532.14331: done transferring module to remote 44842 1727204532.14333: _low_level_execute_command(): starting 44842 1727204532.14340: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/ /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/AnsiballZ_stat.py && sleep 0' 44842 1727204532.15173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.15189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.15202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.15219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.15272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.15284: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.15298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.15314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.15325: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.15335: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.15351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.15368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.15385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.15396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.15405: stderr 
chunk (state=3): >>>debug2: match found <<< 44842 1727204532.15417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.15504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.15520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.15533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.15646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.17391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.17471: stderr chunk (state=3): >>><<< 44842 1727204532.17486: stdout chunk (state=3): >>><<< 44842 1727204532.17590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 44842 1727204532.17594: _low_level_execute_command(): starting 44842 1727204532.17596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/AnsiballZ_stat.py && sleep 0' 44842 1727204532.18251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.18271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.18287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.18307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.18349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.18375: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.18391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.18409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.18422: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.18433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.18446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.18465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.18489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.18501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.18512: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.18526: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.18649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.18677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.18698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.18810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.31900: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44842 1727204532.32912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204532.32916: stdout chunk (state=3): >>><<< 44842 1727204532.32918: stderr chunk (state=3): >>><<< 44842 1727204532.33056: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204532.33071: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204532.33075: _low_level_execute_command(): starting 44842 1727204532.33078: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204532.0542598-48058-183764990112534/ > /dev/null 2>&1 && sleep 0' 44842 1727204532.33669: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.33685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.33699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.33716: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.33759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.33776: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.33789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.33806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.33816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.33827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.33837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.33851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.33879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.33891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.33901: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.33913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.33992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.34009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.34023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.34114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.35975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.35979: stdout chunk (state=3): >>><<< 44842 1727204532.35987: stderr chunk (state=3): >>><<< 
44842 1727204532.36029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.36032: handler run complete 44842 1727204532.36058: attempt loop complete, returning result 44842 1727204532.36061: _execute() done 44842 1727204532.36069: dumping result to json 44842 1727204532.36074: done dumping result, returning 44842 1727204532.36084: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [0affcd87-79f5-aad0-d242-0000000006de] 44842 1727204532.36089: sending task result for task 0affcd87-79f5-aad0-d242-0000000006de 44842 1727204532.36192: done sending task result for task 0affcd87-79f5-aad0-d242-0000000006de 44842 1727204532.36195: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 44842 1727204532.36420: no more pending results, returning what we have 44842 
1727204532.36424: results queue empty 44842 1727204532.36425: checking for any_errors_fatal 44842 1727204532.36427: done checking for any_errors_fatal 44842 1727204532.36427: checking for max_fail_percentage 44842 1727204532.36429: done checking for max_fail_percentage 44842 1727204532.36430: checking to see if all hosts have failed and the running result is not ok 44842 1727204532.36431: done checking to see if all hosts have failed 44842 1727204532.36432: getting the remaining hosts for this loop 44842 1727204532.36433: done getting the remaining hosts for this loop 44842 1727204532.36437: getting the next task for host managed-node1 44842 1727204532.36446: done getting next task for host managed-node1 44842 1727204532.36449: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44842 1727204532.36452: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204532.36455: getting variables 44842 1727204532.36457: in VariableManager get_vars() 44842 1727204532.36494: Calling all_inventory to load vars for managed-node1 44842 1727204532.36497: Calling groups_inventory to load vars for managed-node1 44842 1727204532.36501: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204532.36512: Calling all_plugins_play to load vars for managed-node1 44842 1727204532.36515: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204532.36519: Calling groups_plugins_play to load vars for managed-node1 44842 1727204532.38190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204532.40415: done with get_vars() 44842 1727204532.40458: done getting variables 44842 1727204532.40576: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44842 1727204532.41024: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.409) 0:00:42.578 ***** 44842 1727204532.41061: entering _queue_task() for managed-node1/assert 44842 1727204532.41387: worker is 1 (out of 1 available) 44842 1727204532.41400: exiting _queue_task() for managed-node1/assert 44842 1727204532.41412: done queuing things up, now waiting for results queue to drain 44842 1727204532.41413: waiting for pending results... 
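Annotation: the stat task above returned `{"exists": false}` for `/sys/class/net/ethtest0`, i.e. the interface check reduces to whether that sysfs entry is present. A minimal local equivalent in Python (the helper name and the choice of `os.path.lexists` are illustrative, not the stat module's actual implementation):

```python
import os

def interface_present(name: str) -> dict:
    """Approximate the subset of the stat result seen in the log:
    a network interface exists iff /sys/class/net/<name> is present."""
    path = os.path.join("/sys/class/net", name)
    return {"changed": False, "stat": {"exists": os.path.lexists(path)}}

# Per the log, ethtest0 is expected to be absent on the managed node:
print(interface_present("ethtest0"))
```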
44842 1727204532.41696: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' 44842 1727204532.41806: in run() - task 0affcd87-79f5-aad0-d242-0000000006c5 44842 1727204532.41828: variable 'ansible_search_path' from source: unknown 44842 1727204532.41837: variable 'ansible_search_path' from source: unknown 44842 1727204532.41886: calling self._execute() 44842 1727204532.41997: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.42007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.42019: variable 'omit' from source: magic vars 44842 1727204532.42379: variable 'ansible_distribution_major_version' from source: facts 44842 1727204532.42400: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204532.42417: variable 'omit' from source: magic vars 44842 1727204532.42468: variable 'omit' from source: magic vars 44842 1727204532.42576: variable 'interface' from source: set_fact 44842 1727204532.42597: variable 'omit' from source: magic vars 44842 1727204532.42647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204532.42689: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204532.42714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204532.42740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204532.42756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204532.42797: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204532.42806: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.42814: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.42917: Set connection var ansible_shell_type to sh 44842 1727204532.42932: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204532.42945: Set connection var ansible_connection to ssh 44842 1727204532.42954: Set connection var ansible_pipelining to False 44842 1727204532.42963: Set connection var ansible_timeout to 10 44842 1727204532.42977: Set connection var ansible_shell_executable to /bin/sh 44842 1727204532.43000: variable 'ansible_shell_executable' from source: unknown 44842 1727204532.43008: variable 'ansible_connection' from source: unknown 44842 1727204532.43017: variable 'ansible_module_compression' from source: unknown 44842 1727204532.43034: variable 'ansible_shell_type' from source: unknown 44842 1727204532.43044: variable 'ansible_shell_executable' from source: unknown 44842 1727204532.43053: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.43060: variable 'ansible_pipelining' from source: unknown 44842 1727204532.43069: variable 'ansible_timeout' from source: unknown 44842 1727204532.43077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.43213: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204532.43228: variable 'omit' from source: magic vars 44842 1727204532.43237: starting attempt loop 44842 1727204532.43244: running the handler 44842 1727204532.43404: variable 'interface_stat' from source: set_fact 44842 1727204532.43419: Evaluated conditional (not interface_stat.stat.exists): True 44842 1727204532.43429: handler run complete 44842 1727204532.43448: attempt loop complete, returning result 
44842 1727204532.43454: _execute() done 44842 1727204532.43460: dumping result to json 44842 1727204532.43468: done dumping result, returning 44842 1727204532.43482: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' [0affcd87-79f5-aad0-d242-0000000006c5] 44842 1727204532.43492: sending task result for task 0affcd87-79f5-aad0-d242-0000000006c5 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 44842 1727204532.43632: no more pending results, returning what we have 44842 1727204532.43636: results queue empty 44842 1727204532.43637: checking for any_errors_fatal 44842 1727204532.43646: done checking for any_errors_fatal 44842 1727204532.43647: checking for max_fail_percentage 44842 1727204532.43649: done checking for max_fail_percentage 44842 1727204532.43650: checking to see if all hosts have failed and the running result is not ok 44842 1727204532.43651: done checking to see if all hosts have failed 44842 1727204532.43652: getting the remaining hosts for this loop 44842 1727204532.43654: done getting the remaining hosts for this loop 44842 1727204532.43658: getting the next task for host managed-node1 44842 1727204532.43669: done getting next task for host managed-node1 44842 1727204532.43672: ^ task is: TASK: Verify network state restored to default 44842 1727204532.43674: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204532.43678: getting variables 44842 1727204532.43680: in VariableManager get_vars() 44842 1727204532.43711: Calling all_inventory to load vars for managed-node1 44842 1727204532.43714: Calling groups_inventory to load vars for managed-node1 44842 1727204532.43718: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204532.43733: Calling all_plugins_play to load vars for managed-node1 44842 1727204532.43736: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204532.43742: Calling groups_plugins_play to load vars for managed-node1 44842 1727204532.44881: done sending task result for task 0affcd87-79f5-aad0-d242-0000000006c5 44842 1727204532.44885: WORKER PROCESS EXITING 44842 1727204532.45666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204532.47537: done with get_vars() 44842 1727204532.47569: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:236 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.065) 0:00:42.644 ***** 44842 1727204532.47655: entering _queue_task() for managed-node1/include_tasks 44842 1727204532.48088: worker is 1 (out of 1 available) 44842 1727204532.48099: exiting _queue_task() for managed-node1/include_tasks 44842 1727204532.48111: done queuing things up, now waiting for results queue to drain 44842 1727204532.48113: waiting for pending results... 
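Annotation: the assertion just completed evaluated `not interface_stat.stat.exists`. A plausible reconstruction of the two tasks in `assert_device_absent.yml`, consistent with the module args (`get_attributes`, `get_checksum`, `get_mime` all false, path `/sys/class/net/{{ interface }}`) and the registered variable name shown in the log — the exact file contents are an assumption:

```yaml
# Hedged reconstruction; the real tasks file may differ in wording.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
```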
44842 1727204532.48904: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 44842 1727204532.49112: in run() - task 0affcd87-79f5-aad0-d242-0000000000ab 44842 1727204532.49127: variable 'ansible_search_path' from source: unknown 44842 1727204532.49289: calling self._execute() 44842 1727204532.49524: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.49528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.49541: variable 'omit' from source: magic vars 44842 1727204532.50433: variable 'ansible_distribution_major_version' from source: facts 44842 1727204532.50437: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204532.50441: _execute() done 44842 1727204532.50447: dumping result to json 44842 1727204532.50590: done dumping result, returning 44842 1727204532.50594: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [0affcd87-79f5-aad0-d242-0000000000ab] 44842 1727204532.50602: sending task result for task 0affcd87-79f5-aad0-d242-0000000000ab 44842 1727204532.50695: done sending task result for task 0affcd87-79f5-aad0-d242-0000000000ab 44842 1727204532.50699: WORKER PROCESS EXITING 44842 1727204532.50727: no more pending results, returning what we have 44842 1727204532.50733: in VariableManager get_vars() 44842 1727204532.50775: Calling all_inventory to load vars for managed-node1 44842 1727204532.50778: Calling groups_inventory to load vars for managed-node1 44842 1727204532.50782: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204532.50796: Calling all_plugins_play to load vars for managed-node1 44842 1727204532.50799: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204532.50803: Calling groups_plugins_play to load vars for managed-node1 44842 1727204532.52402: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204532.54541: done with get_vars() 44842 1727204532.54570: variable 'ansible_search_path' from source: unknown 44842 1727204532.54586: we have included files to process 44842 1727204532.54587: generating all_blocks data 44842 1727204532.54589: done generating all_blocks data 44842 1727204532.54592: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44842 1727204532.54593: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44842 1727204532.54596: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44842 1727204532.55094: done processing included file 44842 1727204532.55096: iterating over new_blocks loaded from include file 44842 1727204532.55098: in VariableManager get_vars() 44842 1727204532.55109: done with get_vars() 44842 1727204532.55111: filtering new block on tags 44842 1727204532.55127: done filtering new block on tags 44842 1727204532.55129: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 44842 1727204532.55135: extending task lists for all hosts with included blocks 44842 1727204532.55624: done extending task lists 44842 1727204532.55626: done processing included files 44842 1727204532.55626: results queue empty 44842 1727204532.55627: checking for any_errors_fatal 44842 1727204532.55630: done checking for any_errors_fatal 44842 1727204532.55630: checking for max_fail_percentage 44842 1727204532.55631: done checking for max_fail_percentage 44842 1727204532.55632: checking to see if all hosts have failed and the running 
result is not ok 44842 1727204532.55633: done checking to see if all hosts have failed 44842 1727204532.55633: getting the remaining hosts for this loop 44842 1727204532.55634: done getting the remaining hosts for this loop 44842 1727204532.55637: getting the next task for host managed-node1 44842 1727204532.55641: done getting next task for host managed-node1 44842 1727204532.55643: ^ task is: TASK: Check routes and DNS 44842 1727204532.55645: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204532.55647: getting variables 44842 1727204532.55648: in VariableManager get_vars() 44842 1727204532.55656: Calling all_inventory to load vars for managed-node1 44842 1727204532.55658: Calling groups_inventory to load vars for managed-node1 44842 1727204532.55661: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204532.55669: Calling all_plugins_play to load vars for managed-node1 44842 1727204532.55671: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204532.55673: Calling groups_plugins_play to load vars for managed-node1 44842 1727204532.57359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204532.59254: done with get_vars() 44842 1727204532.59288: done getting variables 44842 1727204532.59374: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:02:12 -0400 (0:00:00.117) 0:00:42.762 ***** 44842 1727204532.59405: entering _queue_task() for managed-node1/shell 44842 1727204532.59734: worker is 1 (out of 1 available) 44842 1727204532.59746: exiting _queue_task() for managed-node1/shell 44842 1727204532.59758: done queuing things up, now waiting for results queue to drain 44842 1727204532.59759: waiting for pending results... 
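Annotation: the "Check routes and DNS" task at `check_network_dns.yml:6` dispatches through the `shell` action (the log loads both the `shell` and `command` action plugins for it). The actual command list is not visible in this chunk; a typical sketch of such a diagnostic task (the specific commands are an assumption):

```yaml
# Hypothetical shape of the task; the commands it really runs are not shown here.
- name: Check routes and DNS
  shell: |
    set -euo pipefail
    ip route
    ip -6 route
    cat /etc/resolv.conf
```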
44842 1727204532.60604: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 44842 1727204532.60745: in run() - task 0affcd87-79f5-aad0-d242-0000000006f6 44842 1727204532.60776: variable 'ansible_search_path' from source: unknown 44842 1727204532.60784: variable 'ansible_search_path' from source: unknown 44842 1727204532.60824: calling self._execute() 44842 1727204532.61069: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.61084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.61099: variable 'omit' from source: magic vars 44842 1727204532.61586: variable 'ansible_distribution_major_version' from source: facts 44842 1727204532.61651: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204532.61676: variable 'omit' from source: magic vars 44842 1727204532.61724: variable 'omit' from source: magic vars 44842 1727204532.61842: variable 'omit' from source: magic vars 44842 1727204532.61892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204532.61944: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 1727204532.61984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204532.62013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204532.62031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204532.62080: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204532.62097: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.62106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.62236: 
Set connection var ansible_shell_type to sh 44842 1727204532.62252: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204532.62273: Set connection var ansible_connection to ssh 44842 1727204532.62284: Set connection var ansible_pipelining to False 44842 1727204532.62294: Set connection var ansible_timeout to 10 44842 1727204532.62312: Set connection var ansible_shell_executable to /bin/sh 44842 1727204532.62337: variable 'ansible_shell_executable' from source: unknown 44842 1727204532.62345: variable 'ansible_connection' from source: unknown 44842 1727204532.62353: variable 'ansible_module_compression' from source: unknown 44842 1727204532.62362: variable 'ansible_shell_type' from source: unknown 44842 1727204532.62376: variable 'ansible_shell_executable' from source: unknown 44842 1727204532.62383: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204532.62390: variable 'ansible_pipelining' from source: unknown 44842 1727204532.62398: variable 'ansible_timeout' from source: unknown 44842 1727204532.62405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204532.62553: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204532.62586: variable 'omit' from source: magic vars 44842 1727204532.62600: starting attempt loop 44842 1727204532.62607: running the handler 44842 1727204532.62622: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204532.62651: 
_low_level_execute_command(): starting 44842 1727204532.62668: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204532.64450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.64466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.64477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.64499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.64537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.64544: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.64554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.64573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.64582: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.64591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.64604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.64618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.64629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.64637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.64644: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.64653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.64736: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 44842 1727204532.64752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.64756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.64844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.66380: stdout chunk (state=3): >>>/root <<< 44842 1727204532.66572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.66575: stdout chunk (state=3): >>><<< 44842 1727204532.66578: stderr chunk (state=3): >>><<< 44842 1727204532.66695: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.66699: _low_level_execute_command(): starting 44842 1727204532.66709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962 `" && echo ansible-tmp-1727204532.6660058-48140-42203488866962="` echo /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962 `" ) && sleep 0' 44842 1727204532.67300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.67314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.67330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.67348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.67397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.67410: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.67425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.67443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.67456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.67479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.67492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.67507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.67523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.67537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.67549: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.67567: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.67641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.67669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.67687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.67779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.69605: stdout chunk (state=3): >>>ansible-tmp-1727204532.6660058-48140-42203488866962=/root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962 <<< 44842 1727204532.69722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.69850: stderr chunk (state=3): >>><<< 44842 1727204532.69867: stdout chunk (state=3): >>><<< 44842 1727204532.70076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204532.6660058-48140-42203488866962=/root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.70079: variable 'ansible_module_compression' from source: unknown 44842 1727204532.70082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204532.70084: variable 'ansible_facts' from source: unknown 44842 1727204532.70150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/AnsiballZ_command.py 44842 1727204532.70351: Sending initial data 44842 1727204532.70354: Sent initial data (155 bytes) 44842 1727204532.71606: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.71624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.71640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.71663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.71708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.71738: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.71755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.71780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.71793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.71805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.71829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.71849: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.71876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.71890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.71902: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.71917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.72011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.72033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.72057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.72154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.73847: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204532.73896: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204532.73950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpj59uo79y /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/AnsiballZ_command.py <<< 44842 1727204532.74000: stderr chunk 
(state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204532.75252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.75505: stderr chunk (state=3): >>><<< 44842 1727204532.75508: stdout chunk (state=3): >>><<< 44842 1727204532.75510: done transferring module to remote 44842 1727204532.75512: _low_level_execute_command(): starting 44842 1727204532.75515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/ /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/AnsiballZ_command.py && sleep 0' 44842 1727204532.76115: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.76128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.76142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.76159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.76206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.76219: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.76233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.76250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.76268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.76283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.76294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.76307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 
1727204532.76321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.76332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.76342: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.76354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.76434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.76451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.76469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.76562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.78274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.78356: stderr chunk (state=3): >>><<< 44842 1727204532.78362: stdout chunk (state=3): >>><<< 44842 1727204532.78459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.78468: _low_level_execute_command(): starting 44842 1727204532.78471: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/AnsiballZ_command.py && sleep 0' 44842 1727204532.79075: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.79091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.79107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.79127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.79177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.79190: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.79205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.79223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.79236: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.79249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.79266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.79282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.79298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
44842 1727204532.79309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.79320: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.79333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.79413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.79435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.79450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.79554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.93293: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2835sec preferred_lft 2835sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\n21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto 
kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:02:12.924063", "end": "2024-09-24 15:02:12.931886", "delta": "0:00:00.007823", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204532.94447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 44842 1727204532.94616: stdout chunk (state=3): >>><<< 44842 1727204532.94621: stderr chunk (state=3): >>><<< 44842 1727204532.94624: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2835sec preferred_lft 2835sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\n21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:02:12.924063", "end": "2024-09-24 15:02:12.931886", "delta": "0:00:00.007823", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo 
pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
44842 1727204532.94627: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204532.94629: _low_level_execute_command(): starting 44842 1727204532.94631: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204532.6660058-48140-42203488866962/ > /dev/null 2>&1 && sleep 0' 44842 1727204532.95216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204532.95230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.95244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.95259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.95303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.95314: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204532.95326: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.95342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204532.95351: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204532.95369: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204532.95381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204532.95393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204532.95406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204532.95415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204532.95425: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204532.95437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204532.95517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204532.95532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204532.95545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204532.95631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204532.97426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204532.97430: stdout chunk (state=3): >>><<< 44842 1727204532.97437: stderr chunk (state=3): >>><<< 44842 1727204532.97454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204532.97463: handler run complete 44842 1727204532.97488: Evaluated conditional (False): False 44842 1727204532.97499: attempt loop complete, returning result 44842 1727204532.97503: _execute() done 44842 1727204532.97506: dumping result to json 44842 1727204532.97515: done dumping result, returning 44842 1727204532.97518: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [0affcd87-79f5-aad0-d242-0000000006f6] 44842 1727204532.97524: sending task result for task 0affcd87-79f5-aad0-d242-0000000006f6 44842 1727204532.97640: done sending task result for task 0affcd87-79f5-aad0-d242-0000000006f6 44842 1727204532.97642: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.007823", "end": "2024-09-24 15:02:12.931886", "rc": 0, "start": "2024-09-24 15:02:12.924063" } STDOUT: IP 1: lo: mtu 65536 
qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2835sec preferred_lft 2835sec inet6 fe80::108f:92ff:fee7:c1ab/64 scope link valid_lft forever preferred_lft forever 21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 44842 1727204532.97714: no more pending results, returning what we have 44842 1727204532.97717: results queue empty 44842 1727204532.97718: checking for any_errors_fatal 44842 1727204532.97720: done checking for any_errors_fatal 44842 1727204532.97720: checking for max_fail_percentage 44842 1727204532.97722: done checking for max_fail_percentage 44842 1727204532.97723: checking to see if all hosts have failed and the running result is not ok 44842 1727204532.97723: done checking to see if all hosts have failed 44842 1727204532.97724: getting the remaining hosts for this loop 44842 1727204532.97726: done getting the remaining hosts for this loop 44842 1727204532.97730: getting the next task for host managed-node1 
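For readability: the "Check routes and DNS" task that just completed above runs its diagnostics with a fallback when `/etc/resolv.conf` is missing (`cat` the file if present, otherwise report its absence without failing under `set -e`). A minimal, offline sketch of that fallback pattern — `check_resolv` and the temp file are illustrative stand-ins, not part of the actual task:

```shell
set -eu

# Sketch of the resolv.conf fallback from the task's _raw_params script:
# print the file if it exists, otherwise note its absence; the trailing
# '|| :' keeps a failing 'ls' from aborting the script under 'set -e'.
check_resolv() {
  file="$1"
  if [ -f "$file" ]; then
    cat "$file"
  else
    echo "NO $file"
    ls -alrtF "$file".* 2>/dev/null || :
  fi
}

# Demonstrate both branches with a temp file instead of /etc/resolv.conf.
tmp=$(mktemp)
echo "nameserver 192.0.2.1" > "$tmp"
present=$(check_resolv "$tmp")
missing=$(check_resolv /nonexistent/resolv.conf)
rm -f "$tmp"
echo "$present"
echo "$missing"
```

The real task inlines this logic directly (no function) and also prints `ip a`, `ip route`, and `ip -6 route` beforehand, as shown in the escaped `cmd` string in the result above.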
44842 1727204532.97737: done getting next task for host managed-node1 44842 1727204532.97740: ^ task is: TASK: Verify DNS and network connectivity 44842 1727204532.97743: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204532.97746: getting variables 44842 1727204532.97748: in VariableManager get_vars() 44842 1727204532.97783: Calling all_inventory to load vars for managed-node1 44842 1727204532.97786: Calling groups_inventory to load vars for managed-node1 44842 1727204532.97789: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204532.97799: Calling all_plugins_play to load vars for managed-node1 44842 1727204532.97801: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204532.97804: Calling groups_plugins_play to load vars for managed-node1 44842 1727204532.99675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204533.01872: done with get_vars() 44842 1727204533.01909: done getting variables 44842 1727204533.01977: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:02:13 -0400 (0:00:00.426) 0:00:43.188 ***** 44842 1727204533.02013: entering _queue_task() for managed-node1/shell 44842 1727204533.02395: worker is 1 (out of 1 available) 44842 1727204533.02407: exiting _queue_task() for managed-node1/shell 44842 1727204533.02420: done queuing things up, now waiting for results queue to drain 44842 1727204533.02421: waiting for pending results... 44842 1727204533.02763: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 44842 1727204533.02905: in run() - task 0affcd87-79f5-aad0-d242-0000000006f7 44842 1727204533.02924: variable 'ansible_search_path' from source: unknown 44842 1727204533.02931: variable 'ansible_search_path' from source: unknown 44842 1727204533.02975: calling self._execute() 44842 1727204533.03127: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204533.03139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204533.03179: variable 'omit' from source: magic vars 44842 1727204533.03619: variable 'ansible_distribution_major_version' from source: facts 44842 1727204533.03629: Evaluated conditional (ansible_distribution_major_version != '6'): True 44842 1727204533.03729: variable 'ansible_facts' from source: unknown 44842 1727204533.04216: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 44842 1727204533.04223: variable 'omit' from source: magic vars 44842 1727204533.04250: variable 'omit' from source: magic vars 44842 1727204533.04273: variable 'omit' from source: magic vars 44842 1727204533.04308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44842 1727204533.04337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44842 
1727204533.04356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44842 1727204533.04372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204533.04381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44842 1727204533.04405: variable 'inventory_hostname' from source: host vars for 'managed-node1' 44842 1727204533.04409: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204533.04413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204533.04483: Set connection var ansible_shell_type to sh 44842 1727204533.04491: Set connection var ansible_module_compression to ZIP_DEFLATED 44842 1727204533.04496: Set connection var ansible_connection to ssh 44842 1727204533.04501: Set connection var ansible_pipelining to False 44842 1727204533.04506: Set connection var ansible_timeout to 10 44842 1727204533.04513: Set connection var ansible_shell_executable to /bin/sh 44842 1727204533.04531: variable 'ansible_shell_executable' from source: unknown 44842 1727204533.04534: variable 'ansible_connection' from source: unknown 44842 1727204533.04538: variable 'ansible_module_compression' from source: unknown 44842 1727204533.04540: variable 'ansible_shell_type' from source: unknown 44842 1727204533.04543: variable 'ansible_shell_executable' from source: unknown 44842 1727204533.04545: variable 'ansible_host' from source: host vars for 'managed-node1' 44842 1727204533.04547: variable 'ansible_pipelining' from source: unknown 44842 1727204533.04549: variable 'ansible_timeout' from source: unknown 44842 1727204533.04554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 44842 1727204533.04656: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204533.04667: variable 'omit' from source: magic vars 44842 1727204533.04675: starting attempt loop 44842 1727204533.04679: running the handler 44842 1727204533.04688: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44842 1727204533.04702: _low_level_execute_command(): starting 44842 1727204533.04708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44842 1727204533.05434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.05478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
<<< 44842 1727204533.05527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204533.05568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204533.07082: stdout chunk (state=3): >>>/root <<< 44842 1727204533.07267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204533.07271: stdout chunk (state=3): >>><<< 44842 1727204533.07275: stderr chunk (state=3): >>><<< 44842 1727204533.07375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204533.07379: _low_level_execute_command(): starting 44842 1727204533.07381: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148 `" && 
echo ansible-tmp-1727204533.07296-48172-132411567008148="` echo /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148 `" ) && sleep 0' 44842 1727204533.07905: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204533.07942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.07955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.07996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.08000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.08073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.08119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204533.08193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204533.10013: stdout chunk (state=3): >>>ansible-tmp-1727204533.07296-48172-132411567008148=/root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148 <<< 44842 1727204533.10198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204533.10202: stdout chunk 
(state=3): >>><<< 44842 1727204533.10204: stderr chunk (state=3): >>><<< 44842 1727204533.10270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204533.07296-48172-132411567008148=/root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204533.10274: variable 'ansible_module_compression' from source: unknown 44842 1727204533.10375: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44842e33nar6b/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44842 1727204533.10471: variable 'ansible_facts' from source: unknown 44842 1727204533.10484: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/AnsiballZ_command.py 44842 1727204533.10650: Sending initial data 44842 1727204533.10653: Sent initial data (154 bytes) 44842 1727204533.11762: stderr chunk 
(state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204533.11780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204533.11800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.11817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.11875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204533.11888: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204533.11909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.11928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204533.11939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204533.11950: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204533.11968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204533.11983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.11997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.12015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204533.12025: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204533.12038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.12132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204533.12153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204533.12174: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44842 1727204533.12269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204533.13944: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44842 1727204533.13996: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 44842 1727204533.14044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44842e33nar6b/tmpymmaif11 /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/AnsiballZ_command.py <<< 44842 1727204533.14092: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 44842 1727204533.15343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204533.15476: stderr chunk (state=3): >>><<< 44842 1727204533.15479: stdout chunk (state=3): >>><<< 44842 1727204533.15481: done transferring module to remote 44842 1727204533.15483: _low_level_execute_command(): starting 44842 1727204533.15486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/ /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/AnsiballZ_command.py && sleep 0' 44842 1727204533.16052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 
1727204533.16058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.16068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.16097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.16100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.16106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.16153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204533.16158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204533.16224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204533.17928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204533.18000: stderr chunk (state=3): >>><<< 44842 1727204533.18004: stdout chunk (state=3): >>><<< 44842 1727204533.18104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204533.18108: _low_level_execute_command(): starting 44842 1727204533.18111: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/AnsiballZ_command.py && sleep 0' 44842 1727204533.18711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204533.18727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204533.18743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.18784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.18826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204533.18840: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204533.18854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.18884: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204533.18901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204533.18912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204533.18923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204533.18935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.18948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.18965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204533.18982: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204533.18998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.19085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204533.19110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204533.19129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204533.19229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204533.64911: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3244 0 --:--:-- --:--:-- --:--:-- 3279\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1399", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:02:13.324151", "end": "2024-09-24 15:02:13.648187", "delta": "0:00:00.324036", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44842 1727204533.66192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 44842 1727204533.66196: stdout chunk (state=3): >>><<< 44842 1727204533.66198: stderr chunk (state=3): >>><<< 44842 1727204533.66355: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left 
Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3244 0 --:--:-- --:--:-- --:--:-- 3279\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1399", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:02:13.324151", "end": "2024-09-24 15:02:13.648187", "delta": "0:00:00.324036", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 44842 1727204533.66371: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44842 1727204533.66374: _low_level_execute_command(): starting 44842 1727204533.66377: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204533.07296-48172-132411567008148/ > /dev/null 2>&1 && sleep 0' 44842 1727204533.67515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44842 1727204533.67531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204533.67546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.67570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.67614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204533.67626: stderr chunk (state=3): >>>debug2: match not found <<< 44842 1727204533.67641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.67658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44842 1727204533.67675: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 44842 1727204533.67684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44842 1727204533.67694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44842 1727204533.67705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44842 1727204533.67719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44842 1727204533.67729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 44842 1727204533.67738: stderr chunk (state=3): >>>debug2: match found <<< 44842 1727204533.67752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44842 1727204533.67832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 44842 1727204533.67855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44842 1727204533.67878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44842 1727204533.67972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44842 1727204533.69828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44842 1727204533.69832: stdout chunk (state=3): >>><<< 44842 1727204533.69834: stderr chunk (state=3): >>><<< 44842 1727204533.70272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44842 1727204533.70276: handler run complete 44842 1727204533.70279: Evaluated conditional (False): False 44842 1727204533.70281: attempt loop complete, returning result 44842 1727204533.70283: _execute() done 44842 1727204533.70285: dumping result to json 44842 1727204533.70287: done dumping result, returning 44842 1727204533.70289: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [0affcd87-79f5-aad0-d242-0000000006f7] 44842 1727204533.70291: sending task result for task 0affcd87-79f5-aad0-d242-0000000006f7 44842 1727204533.70376: done sending task result for task 0affcd87-79f5-aad0-d242-0000000006f7 44842 1727204533.70380: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.324036", "end": "2024-09-24 15:02:13.648187", "rc": 0, "start": "2024-09-24 15:02:13.324151" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3244 0 --:--:-- --:--:-- --:--:-- 3279 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1399 44842 1727204533.70450: no more pending results, returning what we have 44842 1727204533.70462: results queue empty 44842 1727204533.70463: 
checking for any_errors_fatal 44842 1727204533.70477: done checking for any_errors_fatal 44842 1727204533.70478: checking for max_fail_percentage 44842 1727204533.70480: done checking for max_fail_percentage 44842 1727204533.70480: checking to see if all hosts have failed and the running result is not ok 44842 1727204533.70481: done checking to see if all hosts have failed 44842 1727204533.70482: getting the remaining hosts for this loop 44842 1727204533.70484: done getting the remaining hosts for this loop 44842 1727204533.70488: getting the next task for host managed-node1 44842 1727204533.70497: done getting next task for host managed-node1 44842 1727204533.70499: ^ task is: TASK: meta (flush_handlers) 44842 1727204533.70500: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204533.70505: getting variables 44842 1727204533.70507: in VariableManager get_vars() 44842 1727204533.70535: Calling all_inventory to load vars for managed-node1 44842 1727204533.70538: Calling groups_inventory to load vars for managed-node1 44842 1727204533.70542: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204533.70553: Calling all_plugins_play to load vars for managed-node1 44842 1727204533.70556: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204533.70562: Calling groups_plugins_play to load vars for managed-node1 44842 1727204533.72155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204533.75540: done with get_vars() 44842 1727204533.75570: done getting variables 44842 1727204533.75645: in VariableManager get_vars() 44842 1727204533.75656: Calling all_inventory to load vars for managed-node1 44842 1727204533.75659: Calling groups_inventory to load vars for managed-node1 44842 1727204533.75665: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204533.75670: Calling all_plugins_play to load vars for managed-node1 44842 1727204533.75673: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204533.75676: Calling groups_plugins_play to load vars for managed-node1 44842 1727204533.81970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204533.83433: done with get_vars() 44842 1727204533.83467: done queuing things up, now waiting for results queue to drain 44842 1727204533.83469: results queue empty 44842 1727204533.83470: checking for any_errors_fatal 44842 1727204533.83473: done checking for any_errors_fatal 44842 1727204533.83473: checking for max_fail_percentage 44842 1727204533.83474: done checking for max_fail_percentage 44842 1727204533.83474: checking to see if all hosts have failed and the running result is not 
ok 44842 1727204533.83475: done checking to see if all hosts have failed 44842 1727204533.83475: getting the remaining hosts for this loop 44842 1727204533.83476: done getting the remaining hosts for this loop 44842 1727204533.83478: getting the next task for host managed-node1 44842 1727204533.83481: done getting next task for host managed-node1 44842 1727204533.83482: ^ task is: TASK: meta (flush_handlers) 44842 1727204533.83483: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44842 1727204533.83485: getting variables 44842 1727204533.83486: in VariableManager get_vars() 44842 1727204533.83492: Calling all_inventory to load vars for managed-node1 44842 1727204533.83494: Calling groups_inventory to load vars for managed-node1 44842 1727204533.83495: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204533.83500: Calling all_plugins_play to load vars for managed-node1 44842 1727204533.83502: Calling groups_plugins_inventory to load vars for managed-node1 44842 1727204533.83503: Calling groups_plugins_play to load vars for managed-node1 44842 1727204533.84246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204533.86384: done with get_vars() 44842 1727204533.86415: done getting variables 44842 1727204533.86472: in VariableManager get_vars() 44842 1727204533.86490: Calling all_inventory to load vars for managed-node1 44842 1727204533.86493: Calling groups_inventory to load vars for managed-node1 44842 1727204533.86496: Calling all_plugins_inventory to load vars for managed-node1 44842 1727204533.86501: Calling all_plugins_play to load vars for managed-node1 44842 1727204533.86503: Calling groups_plugins_inventory to load vars for 
managed-node1 44842 1727204533.86506: Calling groups_plugins_play to load vars for managed-node1 44842 1727204533.87889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44842 1727204533.90287: done with get_vars() 44842 1727204533.90324: done queuing things up, now waiting for results queue to drain 44842 1727204533.90326: results queue empty 44842 1727204533.90327: checking for any_errors_fatal 44842 1727204533.90329: done checking for any_errors_fatal 44842 1727204533.90329: checking for max_fail_percentage 44842 1727204533.90330: done checking for max_fail_percentage 44842 1727204533.90331: checking to see if all hosts have failed and the running result is not ok 44842 1727204533.90332: done checking to see if all hosts have failed 44842 1727204533.90333: getting the remaining hosts for this loop 44842 1727204533.90334: done getting the remaining hosts for this loop 44842 1727204533.90337: getting the next task for host managed-node1 44842 1727204533.90340: done getting next task for host managed-node1 44842 1727204533.90341: ^ task is: None 44842 1727204533.90343: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44842 1727204533.90344: done queuing things up, now waiting for results queue to drain 44842 1727204533.90345: results queue empty 44842 1727204533.90346: checking for any_errors_fatal 44842 1727204533.90346: done checking for any_errors_fatal 44842 1727204533.90347: checking for max_fail_percentage 44842 1727204533.90348: done checking for max_fail_percentage 44842 1727204533.90349: checking to see if all hosts have failed and the running result is not ok 44842 1727204533.90349: done checking to see if all hosts have failed 44842 1727204533.90350: getting the next task for host managed-node1 44842 1727204533.90357: done getting next task for host managed-node1 44842 1727204533.90358: ^ task is: None 44842 1727204533.90359: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP *********************************************************************
managed-node1              : ok=87   changed=5    unreachable=0    failed=0    skipped=73   rescued=0    ignored=1

Tuesday 24 September 2024  15:02:13 -0400 (0:00:00.884)       0:00:44.073 *****
===============================================================================
Gathering Facts --------------------------------------------------------- 1.89s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.70s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.67s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.58s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 1.53s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.46s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Create veth interface ethtest0 ------------------------------------------ 1.17s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.10s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.03s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 0.97s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.93s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.88s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.88s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.88s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227
Gathering Facts --------------------------------------------------------- 0.87s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.85s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.74s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather current interface info ------------------------------------------- 0.70s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Check if system is ostree ----------------------------------------------- 0.70s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
44842 1727204533.90536: RUNNING CLEANUP
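For reference, the "Verify DNS and network connectivity" task whose result appears in the log above runs a shell command recorded verbatim in the task's `module_args`. Below is that command reconstructed as a standalone script; the `check_connectivity` function wrapper and argument handling are additions for reuse, while the loop body (`getent` lookup followed by an HTTPS `curl`) is exactly what the task executed. The curl progress meter seen in the log's STDERR comes from running `curl` without `-s`.

```shell
#!/usr/bin/env bash
# Reconstruction of the connectivity check from the task's module_args.
# The function wrapper is an addition; the checks themselves match the log.
set -euo pipefail

check_connectivity() {
    echo CHECK DNS AND CONNECTIVITY
    for host in "$@"; do
        # getent resolves through NSS, exercising the same lookup path the
        # system uses (hosts file, then DNS), not just a raw DNS query.
        if ! getent hosts "$host"; then
            echo "FAILED to lookup host $host"
            return 1
        fi
        # An HTTPS fetch confirms the host is actually reachable, not merely
        # resolvable. The progress meter goes to stderr, as in the log.
        if ! curl -o /dev/null "https://$host"; then
            echo "FAILED to contact host $host"
            return 1
        fi
    done
}

# The playbook run above checks the Fedora and CentOS mirror hosts:
# check_connectivity mirrors.fedoraproject.org mirrors.centos.org
```

Splitting the name resolution and the HTTPS fetch into separate checks, as the task does, distinguishes a DNS/NSS failure from a routing or firewall failure in the output.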